I want to use the M+ fonts with the CVS HEAD of Emacs, but Emacs dies with a SEGV.
I tried chasing it with gdb, and it shows me behavior I cannot make sense of…
cheekcat@yue 1115 % gdb --directory=/usr/portage/distfiles/cvs-src/emacs/src/ --args emacs -Q
GNU gdb 6.8
Copyright (C) 2008 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law. Type "show copying"
and "show warranty" for details.
This GDB was configured as "x86_64-pc-linux-gnu"...
(gdb) r
Starting program: /usr/bin/emacs -Q
[Thread debugging using libthread_db enabled]
[New Thread 0x7f5ea63e86f0 (LWP 6253)]
Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x7f5ea63e86f0 (LWP 6253)]
0x00000000004dd01e in realize_x_face (cache=0x183a220, attrs=0x7fffae415a30) at xfaces.c:5837
5837 fontset = default_face->fontset;
First, an excerpt from realize_x_face () in xfaces.c:
.-------------------------------------------------------------------------------
| default_face = FACE_FROM_ID (f, DEFAULT_FACE_ID);
| if (default_face
|     && lface_same_font_attributes_p (default_face->lface, attrs))
|   {
|     face->font = default_face->font;
|     face->fontset
|       = make_fontset_for_ascii_face (f, default_face->fontset, face);
|   }
| else
|   {
|     /* If the face attribute ATTRS specifies a fontset, use it as
|        the base of a new realized fontset.  Otherwise, use the same
|        base fontset as of the default face.  The base determines
|        registry and encoding of a font.  It may also determine
|        foundry and family.  The other fields of font name pattern
|        are constructed from ATTRS.  */
|     int fontset = face_fontset (attrs);
|
|     /* If we are realizing the default face, ATTRS should specify a
|        fontset.  In other words, if FONTSET is -1, we are not
|        realizing the default face, thus the default face should have
|        already been realized.  */
|     if (fontset == -1)
|       fontset = default_face->fontset;
|     if (fontset == -1)
|       abort ();
|     if (! FONT_OBJECT_P (attrs[LFACE_FONT_INDEX]))
|       attrs[LFACE_FONT_INDEX]
|         = font_load_for_lface (f, attrs, attrs[LFACE_FONT_INDEX]);
|     if (FONT_OBJECT_P (attrs[LFACE_FONT_INDEX]))
|       {
|         face->font = XFONT_OBJECT (attrs[LFACE_FONT_INDEX]);
|         face->fontset = make_fontset_for_ascii_face (f, fontset, face);
|       }
|     else
|       {
|         face->font = NULL;
|         face->fontset = -1;
|       }
|   }
`-------------------------------------------------------------------------------
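Notice that nothing in this else branch checks default_face for NULL before the fallback assignment; the comment merely asserts the invariant that the default face has already been realized. A defensive variant of that fallback might look like this (my illustration only, not the actual source):
.-------------------------------------------------------------------------------
| /* Hypothetical defensive variant of the fallback (illustration only):
|    make the "default face is already realized" invariant explicit
|    instead of dereferencing default_face unchecked.  */
| if (fontset == -1)
|   {
|     if (default_face == NULL)
|       abort ();               /* invariant violated: bail out here */
|     fontset = default_face->fontset;
|   }
`-------------------------------------------------------------------------------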
The SEGV occurs on the following line:
fontset = default_face->fontset;
Here default_face == 0, so the dereference faults. Let's inspect the variables involved first:
(gdb) print default_face
$1 = (struct face *) 0x0
(gdb) print fontset
$2 = -1
(gdb) print attrs
$3 = (Lisp_Object *) 0x7fffae415a30
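In other words, the load goes through a null pointer, so it reads from a tiny offset into the unmapped zero page. A minimal standalone model (the member layout is invented; the real struct face in dispextern.h has many more fields):
.-------------------------------------------------------------------------------
| #include <stddef.h>
| #include <stdio.h>
|
| /* Invented layout; the point is only where the faulting address lands.  */
| struct face_model { void *lface; void *font; int fontset; };
|
| int
| main (void)
| {
|   /* ((struct face_model *) NULL)->fontset reads from this small
|      offset into page 0, which is unmapped, hence the SIGSEGV.  */
|   printf ("offset of fontset: %zu\n",
|           offsetof (struct face_model, fontset));
|   return 0;
| }
`-------------------------------------------------------------------------------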
Next, let's take a look at face_fontset ().
xfaces.c, face_fontset ():
.-------------------------------------------------------------------------------
| static int
| face_fontset (attrs)
|      Lisp_Object *attrs;
| {
|   Lisp_Object name;
|
|   name = attrs[LFACE_FONTSET_INDEX];
|   if (!STRINGP (name))
|     return -1;
|   return fs_query_fontset (name, 0);
| }
`-------------------------------------------------------------------------------
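Read literally, this function can return -1 by two routes: the fontset attribute is not a string, or fs_query_fontset finds no fontset of that name. A standalone model of the two routes (the stub names and behavior are mine, not the real fontset.c):
.-------------------------------------------------------------------------------
| /* Standalone model of face_fontset's two -1 routes (stub only).  */
| static int
| fs_query_fontset_stub (const char *name)
| {
|   return -1;                  /* pretend NAME names no known fontset */
| }
|
| static int
| face_fontset_model (const char *name)
| {
|   if (name == NULL)           /* models !STRINGP (name) */
|     return -1;                /* route 1: attribute is not a string */
|   return fs_query_fontset_stub (name); /* route 2: lookup fails */
| }
`-------------------------------------------------------------------------------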
So, given that face_fontset (attrs) == -1, it should be that !STRINGP (name) ≠ 0 (at least assuming fs_query_fontset did not itself return -1, as in route 2 above).
And yet:
(gdb) print attrs[LFACE_FONTSET_INDEX]
$4 = 22619155
(gdb) print Fstringp(attrs[LFACE_FONTSET_INDEX])
$5 = 11844081
(gdb) print Qt
$6 = 11844081
(gdb) print Qnil
$7 = 11843985
So Fstringp (attrs[LFACE_FONTSET_INDEX]) is returning Qt. Just to be thorough, let's look at Fstringp () as well:
DEFUN ("stringp", Fstringp, Sstringp, 1, 1, 0,
doc: /* Return t if OBJECT is a string. */)
(object)
Lisp_Object object;
{
if (STRINGP (object))
return Qt;
return Qnil;
}
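So Fstringp is just the STRINGP macro wrapped up as a Lisp primitive, and STRINGP itself is a type-tag check on the Lisp_Object word. A much-simplified model of such a tag check (the tag width and values are invented; the real scheme in lisp.h varies with the build configuration):
.-------------------------------------------------------------------------------
| #include <stdint.h>
|
| typedef intptr_t LO_model;    /* stand-in for Lisp_Object */
|
| /* Invented tagging scheme: low 3 bits hold the type tag.  */
| enum { TAG_MASK_MODEL = 7, TAG_STRING_MODEL = 4 };
|
| /* Model of a STRINGP-style predicate: a pure bit test on the word.  */
| static int
| stringp_model (LO_model obj)
| {
|   return (obj & TAG_MASK_MODEL) == TAG_STRING_MODEL;
| }
`-------------------------------------------------------------------------------
A pure bit test on one and the same word can only ever give one answer, which makes what follows all the stranger.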
Thus STRINGP (attrs[LFACE_FONTSET_INDEX]) ≠ 0 appears to hold, and at the same time !STRINGP (attrs[LFACE_FONTSET_INDEX]) ≠ 0 appears to hold as well.
There must be a mistake somewhere, but where is it?