
System that assists in observing a luminal organ using the structure of the luminal organ

Foreign patent code F120006084
Reference number S2008-0104
Date posted January 6, 2012
Country of filing United States of America
Application number 22692207
Publication number 20090161927
Publication number 8199984
Filing date February 17, 2007
Publication date June 25, 2009
Publication date June 12, 2012
International application number JP2007052894
International publication number WO2007129493
International filing date February 17, 2007
International publication date November 15, 2007
Priority data
  • 特願2006-128681 (2006.5.2) JP
  • 2007WO-JP52894 (2007.2.17) WO
Title of invention (English) System that assists in observing a luminal organ using the structure of the luminal organ
Abstract (English) (US8199984)
A medical image observation assisting system 1 includes a CT-image-data retrieving portion 10, a CT-image-data storing portion 11, an information extracting portion 12, an anatomical information DB 13, a point of view/line of view setting portion 14, a luminal organ image generating portion 15, an anatomical nomenclature information generating portion 16, a branch specifying portion 17, an image synthesizing and displaying portion 18, and a user I/F control portion 19.
The point of view/line of view setting portion 14 sets a point of view and line of view for observing an external profile of a luminal organ, on the basis of structure information of the luminal organ extracted by the information extracting portion 12, while a point of interest is kept substantially on a centerline of the organ.
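As a rough illustration of the point of view/line of view setting described in the abstract, the following minimal Python sketch places a camera outside the organ so that it looks at a point of interest kept on the centerline and the organ profile fills a desired share of the view. The centerline is assumed to be an array of 3-D points; the helper name set_exterior_view, the fixed field of view, and the use of NumPy are illustrative assumptions, not part of the patent text.

    import numpy as np

    def set_exterior_view(centerline, idx, organ_radius, fov_deg=60.0, up=(0.0, 0.0, 1.0)):
        # Point of interest kept (substantially) on the centerline of the organ.
        centerline = np.asarray(centerline, dtype=float)
        p = centerline[idx]
        j = idx + 1 if idx + 1 < len(centerline) else idx - 1
        tangent = centerline[j] - p
        tangent = tangent / (np.linalg.norm(tangent) + 1e-12)
        # Offset direction roughly perpendicular to the centerline, so the
        # external profile of the organ is seen from the side.
        side = np.cross(tangent, np.asarray(up, dtype=float))
        if np.linalg.norm(side) < 1e-6:
            side = np.cross(tangent, np.array([1.0, 0.0, 0.0]))
        side = side / np.linalg.norm(side)
        # Back the point of view away until the organ radius fills the field of
        # view, i.e. the displayed region of the organ has the desired size.
        distance = organ_radius / np.tan(np.radians(fov_deg) / 2.0)
        eye = p + distance * side                # point of view
        line_of_view = (p - eye) / distance      # line of view toward the point of interest
        return eye, line_of_view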
Claims (English) [claim1]
1. A medical image observation assisting system, comprising: a volume-region setting portion configured to sequentially set volume regions each enveloping a part of a luminal organ extending within a subject body, on the basis of three-dimensional image data of the subject body, such that the volume regions are adjacent to each other;
a luminal-organ-region-information calculating portion configured to repeatedly calculate luminal region data in the form of region information of said part of the luminal organ within each of said volume regions set by said volume-region setting portion, on the basis of the three-dimensional image data of the part of the luminal organ within said volume region;
a luminal-organ-structure-information calculating portion configured to calculate luminal structure data in the form of structure information of the part of the luminal organ within the volume region for which said luminal region data has been calculated by said luminal-organ-region-information calculating portion;
a virtual-centerline generating portion configured to generate a virtual centerline extending in a longitudinal direction of said luminal organ, on the basis of said luminal structure data;
a virtual-image generating portion configured to generate a virtual image of said luminal organ along said virtual centerline;
a display portion configured to display said virtual image of said luminal organ;
an observing-position specifying portion configured to determine an observing position for generating said virtual image, on the basis of at least one of said virtual centerline, said luminal region data and said luminal structure data, such that a region of said luminal organ displayed on said display portion has a desired size, and to move said observing position in the longitudinal direction of said luminal organ, on the basis of said virtual centerline or said luminal structure data;
an anatomical-structure-information storing portion configured to store anatomical structure information including at least anatomical nomenclature information; and
an anatomical-nomenclature correlating portion configured to correlate the anatomical nomenclature information stored in said anatomical-structure-information storing portion, with said luminal structure data,
and wherein said virtual-image generating portion changes a method of processing said virtual image of the luminal organ, on the basis of said anatomical structure information or said luminal structure data.
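Claim 1 describes sequentially setting small volume regions (VOIs) along the luminal organ, extracting region and structure data inside each one, and generating a virtual centerline. The sketch below is a minimal illustration of that idea only: it assumes the 3-D image is a NumPy array, uses plain thresholding as a stand-in for the claimed region extraction, and the function name trace_lumen, the seed point, and the fixed VOI size are hypothetical.

    import numpy as np

    def trace_lumen(volume, seed, threshold, voi_size=16, steps=200):
        centerline = []
        center = np.asarray(seed, dtype=float)
        direction = np.array([0.0, 0.0, 1.0])        # assumed initial direction of travel
        half = voi_size // 2
        for _ in range(steps):
            # Set a volume region (VOI) enveloping the next part of the lumen.
            lo = np.maximum(center.astype(int) - half, 0)
            hi = np.minimum(center.astype(int) + half, np.array(volume.shape) - 1)
            voi = volume[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
            mask = voi > threshold                   # luminal region data inside the VOI
            if not mask.any():
                break                                # lumen lost: stop tracing
            # Luminal structure data: here simply the centroid of the region.
            centroid = lo + np.argwhere(mask).mean(axis=0)
            step = centroid - center
            if np.linalg.norm(step) > 1e-6:
                direction = step / np.linalg.norm(step)
            centerline.append(centroid)
            center = centroid + direction * half     # next, adjacent VOI along the lumen
        return np.array(centerline)                  # virtual centerline points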
[claim2]
2. The medical image observation assisting system according to claim 1, further comprising an image synthesizing portion configured to display anatomical nomenclature information of said luminal organ on the virtual image displayed on said display portion, on the basis of said anatomical nomenclature information which is correlated with said luminal structure data, by said anatomical-nomenclature correlating portion.
[claim3]
3. The medical image observation assisting system according to claim 2, wherein said image synthesizing portion displays a real endoscope image taken by an endoscope actually inserted into said luminal organ of said subject body and said virtual image which corresponds to said real endoscope image and which is generated by said virtual-image generating portion such that said real endoscope image and said virtual image can be compared with each other.
[claim4]
4. The medical image observation assisting system according to claim 2, wherein said image synthesizing portion displays said anatomical nomenclature of said luminal organ on a real endoscope image displayed on said display portion, on the basis of correlation of said anatomical nomenclature by said anatomical-nomenclature correlating portion, said real endoscope image being taken by an endoscope actually inserted into said luminal organ of said subject body.
[claim5]
5. The medical image observation assisting system according to claim 3, further comprising: a navigating portion configured to display an image for navigating a path from a position of insertion of an endoscope into said luminal organ to a target portion of the luminal organ,
and wherein said navigating portion displays an indication of one of a plurality of branches of said luminal organ open at a bifurcated portion thereof indicated on the image displayed on said display portion, said endoscope being advanced into said one of the plurality of branches.
[claim6]
6. The medical image observation assisting system according to claim 3, further comprising: a navigating portion configured to display an image for navigating a path from a position of insertion of an endoscope into said luminal organ to a target portion of the luminal organ,
and wherein said navigating portion automatically generates said path, and displays a plurality of anatomical names correlated by said anatomical-nomenclature correlating portion with respective portions of the luminal organ defining said path, in the order from said position of insertion of the endoscope to said target portion.
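Claim 6 automatically generates a navigation path and lists the anatomical names along it, from the insertion position to the target. A minimal sketch, assuming the luminal structure data has already been reduced to a branch adjacency dictionary and a branch-to-name mapping (both hypothetical inputs):

    from collections import deque

    def navigation_names(tree, names, insertion, target):
        # Breadth-first search over the branch tree from the insertion branch.
        parent = {insertion: None}
        queue = deque([insertion])
        while queue:
            node = queue.popleft()
            if node == target:
                break
            for nxt in tree.get(node, ()):
                if nxt not in parent:
                    parent[nxt] = node
                    queue.append(nxt)
        if target not in parent:
            return []                                # target branch not reachable
        path = []
        node = target
        while node is not None:                      # walk back from the target to the insertion point
            path.append(node)
            node = parent[node]
        path.reverse()
        # Anatomical names in order from the insertion position to the target portion.
        return [names.get(branch, "(unnamed)") for branch in path]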
[claim7]
7. The medical image observation assisting system according to claim 5, wherein said navigating portion sets a portion of said luminal organ which is located close to an extraluminal tissue existing outside said luminal organ in said subject body and into which said endoscope can be inserted, as said target portion of said luminal organ, when said extraluminal tissue is set as a pseudo-target portion.
[claim8]
8. The medical image observation assisting system according to claim 1, further comprising: a virtual-image storing portion configured to store each of a plurality of virtual images generated by said virtual-image generating portion, each virtual image including a bifurcated portion of said luminal organ, such that said each virtual image is correlated with said luminal structure data corresponding to said each virtual image; and
a second real-image observing-position estimating portion configured to extract features which appear on a real endoscope image taken by an endoscope actually inserted into said luminal organ of said subject body and which correspond to the luminal structure data, to verify the extracted features against the luminal structure data stored in said virtual-image storing portion, and to estimate the observing position of the virtual image corresponding to the luminal structure data verified to match the extracted features, as a real-image observing position which is a position of the leading end portion of said endoscope within said luminal organ.
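Claim 8 estimates the endoscope tip position by verifying features of the real endoscope image against the features stored with each virtual image. A minimal sketch, assuming every stored entry pairs an observing position with a small feature dictionary (branch count, branch positions in the image plane, luminosity); this particular feature set and scoring are illustrative choices, not taken from the patent text:

    import numpy as np

    def estimate_real_image_position(real_features, virtual_db):
        # virtual_db: assumed list of (observing_position, features) pairs,
        # one per stored virtual image of a bifurcated portion.
        best_position, best_score = None, np.inf
        for observing_position, vfeat in virtual_db:
            if vfeat["n_branches"] != real_features["n_branches"]:
                continue                             # branch counts must agree
            d_pos = np.linalg.norm(np.asarray(vfeat["positions"], dtype=float)
                                   - np.asarray(real_features["positions"], dtype=float))
            d_lum = abs(vfeat["luminosity"] - real_features["luminosity"])
            score = d_pos + d_lum                    # simple combined dissimilarity
            if score < best_score:
                best_position, best_score = observing_position, score
        # Observing position of the best-matching virtual image, taken as the
        # real-image observing position (estimated endoscope tip position).
        return best_position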
[claim9]
9. The medical image observation assisting system according to claim 8, wherein said virtual-image generating portion generates said virtual image such that said real-image observing position estimated by said second real-image observing-position estimating portion is determined as said observing position of the virtual image.
[claim10]
10. The medical image observation assisting system according to claim 8, wherein said virtual image and said real endoscope image corresponding to said luminal structure data have at least one feature selected from the number of luminally structural portions, the positions of the luminally structural portions, and the luminosity of the image of the luminally structural portions.
[claim11]
11. The medical image observation assisting system according to claim 8, wherein said virtual-image storing portion comprises a virtual-image learning portion configured to implement learning modification of contents of said virtual-image storing portion, on the basis of a result of the verification of the extracted features with respect to the luminal structure data by the second real-image observing-position estimating portion.
[claim12]
12. A medical image observation assisting system, comprising: a volume-region setting portion configured to sequentially set volume regions each enveloping a part of a luminal organ extending within a subject body, on the basis of three-dimensional image data of the subject body, such that the volume regions are adjacent to each other;
a luminal-organ-region-information calculating portion configured to repeatedly calculate luminal region data in the form of region information of said part of the luminal organ within each of said volume regions set by said volume-region setting portion, on the basis of the three-dimensional image data of the part of the luminal organ within said volume region;
a luminal-organ-structure-information calculating portion configured to calculate luminal structure data in the form of structure information of the part of the luminal organ within the volume region for which said luminal region data has been calculated by said luminal-organ-region-information calculating portion;
a virtual-centerline generating portion configured to generate a virtual centerline extending in a longitudinal direction of said luminal organ, on the basis of said luminal structure data;
a virtual-image generating portion configured to generate a virtual image of said luminal organ along said virtual centerline;
a display portion configured to display said virtual image of said luminal organ;
an observing-position specifying portion configured to determine an observing position for generating said virtual image, on the basis of at least one of said virtual centerline, said luminal region data and said luminal structure data, such that a region of said luminal organ displayed on said display portion has a desired size, and to move said observing position in the longitudinal direction of said luminal organ, on the basis of said virtual centerline or said luminal structure data;
an endoscope-position detecting portion configured to detect a relative position of a leading end portion of an endoscope actually inserted into said luminal organ of said subject body; and
a first real-image observing-position estimating portion configured to calculate a transformation matrix by comparing the relative position of the leading end portion of the endoscope detected by said endoscope-position detecting portion, with said luminal structure data, and converting the relative position of the leading end portion of the endoscope according to the transformation matrix, to thereby estimate a real-image observing position which is a position of the leading end portion of said endoscope within said luminal organ.
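Claim 12 calculates a transformation matrix relating the relative tip positions reported by the endoscope-position detecting portion to the luminal structure data. One common way to obtain such a matrix is a least-squares rigid fit (Kabsch method, via SVD); the sketch below assumes that corresponding point pairs are already available, which the claim itself does not specify.

    import numpy as np

    def rigid_transform(sensor_points, structure_points):
        # Least-squares rigid fit mapping points in the sensor (relative) frame
        # onto corresponding points of the luminal structure data.
        P = np.asarray(sensor_points, dtype=float)     # N x 3, sensor frame
        Q = np.asarray(structure_points, dtype=float)  # N x 3, structure-data frame
        Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
        R = Vt.T @ D @ U.T
        t = Q.mean(axis=0) - R @ P.mean(axis=0)
        T = np.eye(4)                                  # homogeneous 4 x 4 transformation matrix
        T[:3, :3], T[:3, 3] = R, t
        return T

    def estimate_tip_position(T, tip_relative):
        # Convert a relative tip position into the structure-data frame,
        # yielding the real-image observing position.
        return (T @ np.append(np.asarray(tip_relative, dtype=float), 1.0))[:3]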
[claim13]
13. The medical image observation assisting system according to claim 12, wherein said virtual-image generating portion generates said virtual image such that said real-image observing position estimated by said first real-image observing-position estimating portion is determined as said observing position of the virtual image.
[claim14]
14. A medical image observation assisting system, comprising: a volume-region setting portion configured to sequentially set volume regions each enveloping a part of a luminal organ extending within a subject body, on the basis of three-dimensional image data of the subject body, such that the volume regions are adjacent to each other;
a luminal-organ-region-information calculating portion configured to repeatedly calculate luminal region data in the form of region information of said part of the luminal organ within each of said volume regions set by said volume-region setting portion, on the basis of the three-dimensional image data of the part of the luminal organ within said volume region;
a luminal-organ-structure-information calculating portion configured to calculate luminal structure data in the form of structure information of the part of the luminal organ within the volume region for which said luminal region data has been calculated by said luminal-organ-region-information calculating portion;
a virtual-centerline generating portion configured to generate a virtual centerline extending in a longitudinal direction of said luminal organ, on the basis of said luminal structure data;
a virtual-image generating portion configured to generate a virtual image of said luminal organ along said virtual centerline;
a display portion configured to display said virtual image of said luminal organ;
an observing-position specifying portion configured to determine an observing position for generating said virtual image, on the basis of at least one of said virtual centerline, said luminal region data and said luminal structure data, such that a region of said luminal organ displayed on said display portion has a desired size, and to move said observing position in the longitudinal direction of said luminal organ, on the basis of said virtual centerline or said luminal structure data; and
an extraluminal tissue extracting portion configured to extract extraluminal tissue structure information relating to a structure of an extraluminal tissue existing outside said luminal organ in said subject body, on the basis of said three-dimensional image data,
and wherein said virtual-image generating portion commands said display portion to display the virtual image of said luminal organ and a virtual image of said extraluminal tissue within the same screen in the same scale while maintaining an actual positional relationship between the virtual images.
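Claim 14 displays the luminal organ and an extraluminal tissue (for example a lymph node, cf. claim 18) on the same screen, at the same scale, with their actual positional relationship preserved. Because both are extracted from the same three-dimensional image data, they share one coordinate frame, so rendering them with a single camera is sufficient; the short sketch below merely places such a camera to enclose both point sets, and the helper name and the fixed viewing direction are assumptions.

    import numpy as np

    def common_view(lumen_points, tissue_points, fov_deg=60.0):
        # Both point sets are assumed to be in the same CT coordinate frame, so
        # one camera shows them at the same scale and relative position.
        pts = np.vstack([np.asarray(lumen_points, dtype=float),
                         np.asarray(tissue_points, dtype=float)])
        center = pts.mean(axis=0)
        radius = np.linalg.norm(pts - center, axis=1).max()
        distance = radius / np.tan(np.radians(fov_deg) / 2.0)
        eye = center + np.array([0.0, 0.0, distance])  # assumed viewing direction
        return eye, center                             # camera position and look-at point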
[claim15]
15. The medical image observation assisting system according to claim 14, further comprising: an anatomical-structure-information storing portion configured to store anatomical structure information including at least anatomical nomenclature information for said luminal organ and at least an anatomical number for said extraluminal tissue; and
an anatomical-nomenclature correlating portion configured to correlate the anatomical nomenclature information of said luminal organ stored in said anatomical-structure-information storing portion, with said luminal structure data, and to correlate the anatomical number of said extraluminal tissue stored in the anatomical-structure-information storing portion, with said extraluminal tissue structure information.
[claim16]
16. The medical image observation assisting system according to claim 15, further comprising: an image synthesizing portion configured to display the anatomical name of said luminal organ and the anatomical number of said extraluminal tissue on the virtual images displayed on said display portion, on the basis of said anatomical nomenclature information and said anatomical number which are correlated with said luminal structure data and said extraluminal tissue structure information by said anatomical-nomenclature correlating portion.
[claim17]
17. The medical image observation assisting system according to claim 15, wherein said virtual-image generating portion changes an image processing method on the basis of at least one of said anatomical structure information, said luminal structure data and said extraluminal tissue structure information.
[claim18]
18. The medical image observation assisting system according to claim 14, wherein said extraluminal tissue is a lymph node, while said luminal organ is a blood vessel.
Inventors / Applicants (English)
  • MORI KENSAKU
  • KITASAKA TAKAYUKI
  • DEGUCHI DAISUKE
  • NAGOYA UNIVERSITY
International Patent Classification (IPC)
U.S. Patent Classification / main & sub
  • A61B006/46B10
  • G06T007/00P1E
  • S06T207/10081
  • S06T207/30061
This page presents published patent information from Nagoya University. If you are interested in any of the listed cases, please contact us by e-mail at the address below.
