
THREE-DIMENSIONAL MEASUREMENT SYSTEM, THREE-DIMENSIONAL MEASUREMENT METHOD, AND THREE-DIMENSIONAL MEASUREMENT PROGRAM

Foreign patent code: F170008945
Reference number: (S2015-1503-N0)
Date posted: February 1, 2017
Filing authority: World Intellectual Property Organization (WIPO)
International application number: 2016JP065084
International publication number: WO 2016186211
International filing date: May 20, 2016 (2016.5.20)
International publication date: November 24, 2016 (2016.11.24)
Priority data
  • Japanese Patent Application No. 2015-103366 (2015.5.21) JP
Title of invention (English): THREE-DIMENSIONAL MEASUREMENT SYSTEM, THREE-DIMENSIONAL MEASUREMENT METHOD, AND THREE-DIMENSIONAL MEASUREMENT PROGRAM
Abstract (English): A three-dimensional measurement system (100) for estimating the depth of a measurement object (4), onto which a two-dimensional pattern is projected by projection light via a projection optical system, by comparing image data captured by a camera (2) with a reference image data group that is correlated with the depth of the measurement object (4) and acquired in advance. The two-dimensional pattern is projected onto the measurement object (4) through a coded aperture pattern provided in the projection optical system. The depth of the measurement object (4) is estimated by image processing that applies both a stereo method, which performs image matching based on the parallax of the projection optical system and the imaging means with respect to the measurement object (4), and a DfD method based on the blur, in the optical axis direction, of the coded aperture pattern in the image data of the measurement object (4).
Overview of prior and competing art (English): BACKGROUND ART
In recent years, measurement of the three-dimensional shape of an object has been actively pursued in various fields. Among such techniques, active three-dimensional measurement methods using structured light (pattern light) and a camera are the mainstream, from the viewpoint of simplicity and cost. In an active three-dimensional measurement method, pattern light is projected onto the measurement object, and the light reflected at the surface of the object is observed by a camera. The three-dimensional shape of the object (its depth in the focus direction) is then reconstructed by triangulation, based on the correspondence between the original projected pattern and the pattern observed in the image captured by the camera. To compute this correspondence efficiently, many techniques in which a video projector projects a two-dimensional pattern have been proposed (for example, see Non-Patent Document 1).
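The triangulation step described above can be sketched as follows for the rectified projector-camera case. The focal length, baseline, and disparity values are illustrative assumptions, not figures from the patent.

```python
# Minimal sketch of depth recovery by triangulation in a rectified
# projector-camera pair. Illustrative symbols (not from the patent):
# f_px = focal length in pixels, baseline_m = projector-camera baseline
# in metres, disparity_px = pixel offset between the projected pattern
# and its observed position in the camera image.

def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth along the optical axis: z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / disparity_px

# Example: f = 1000 px, baseline = 0.1 m, disparity = 50 px -> depth 2.0 m
print(depth_from_disparity(1000.0, 0.1, 50.0))
```

The same relation underlies why a larger disparity change per unit depth (a longer baseline) improves depth resolution in stereo-based systems.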
However, a video projector has a shallow depth of field, so the measurable depth range in the focus direction is restricted. To overcome this restriction on the measurement range, methods using a laser light source, which offers a large depth of field, have been proposed. However, when a laser light source is used, a special optical system is required, and it is therefore difficult to construct the optical system for a given application.
As another approach to eliminating the narrow measurement range, DfD (Depth from Defocus), which estimates depth from the blur of the observed pattern, has been proposed (for example, see Non-Patent Document 2). Because the DfD method is premised on blur of the observed pattern, it is less constrained by the depth of field. However, the measuring apparatus of this approach uses an LED array light source, so a fine pattern cannot be projected onto the object, and the resulting depth map is disadvantageously sparse. That is, in the DfD approach the change of the pattern with respect to a change in depth is limited by the lens aperture, which limits the depth resolution. In addition, estimating the depth requires a great amount of deconvolution processing, so the enormous amount of calculation is also a disadvantage.
The DfD method is generally known as a method based on image blur at the camera, and it can estimate depth from a single image captured under a single condition. However, for the DfD method to work well, the target object is assumed to carry high-frequency texture, which limits the range of real scenes to which it can be applied. Therefore, a DfD method that mathematically models the defocus blur of both the projected light pattern and the imaging camera, and that realizes distance measurement in real time, has been proposed (for example, see Non-Patent Document 3). In this method a checkerboard pattern is projected and DfD is performed from the blur in the captured result; according to this method, measurement is possible even when the measurement object has no texture. However, the technique described in Non-Patent Document 3 requires two images with different focusing distances and a prism placed between the lens and the imaging device, so the imaging optical system must be specially devised.
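The premise shared by the DfD methods above is that, under a thin-lens model, the blur-circle diameter on the sensor grows with the object's distance from the focus plane, so blur encodes depth. A minimal sketch of that relation follows; all lens and distance values are invented for illustration.

```python
# Hedged sketch of the DfD premise: blur-circle diameter vs. object depth
# under the thin-lens model. Numbers are illustrative only.

def image_distance(f: float, s: float) -> float:
    """Thin-lens equation 1/f = 1/s + 1/v, solved for the image distance v."""
    return 1.0 / (1.0 / f - 1.0 / s)

def blur_diameter(aperture: float, f: float, s_focus: float, s_obj: float) -> float:
    """Blur-circle diameter on a sensor positioned to focus at s_focus."""
    v_sensor = image_distance(f, s_focus)   # where the sensor plane sits
    v_obj = image_distance(f, s_obj)        # where the object actually focuses
    return aperture * abs(v_sensor / v_obj - 1.0)

# An object on the focus plane is sharp; a farther object blurs more.
in_focus = blur_diameter(0.05, 0.05, 1.0, 1.0)
defocused = blur_diameter(0.05, 0.05, 1.0, 2.0)
print(in_focus, defocused)
```

This monotone growth of blur with defocus is also why the change of the pattern per unit depth, and hence the achievable depth resolution, is bounded by the lens aperture, as noted above.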
In addition, a method of estimating the depth of the projection plane by analyzing the blur of the projector image has been proposed (for example, see Non-Patent Document 4). According to this method, the depth at each pixel of the captured image can be estimated by acquiring images while shifting the phase of a line-shaped projection pattern. However, because this method requires a plurality of images taken while changing the projection pattern, it has the disadvantage that measuring the depth of a moving object is difficult.
In addition, a three-dimensional measurement method using structured light generated by a coded aperture attached to the projection optical system of a projector has been disclosed (for example, see Non-Patent Document 5). In this method, a code pattern arranged in a lattice is projected through the coded aperture provided to the light source, and the distance of each point is determined by DfD, using the degree of blur of the projection pattern as observed on the measurement object.
In the method disclosed in Non-Patent Document 5, the level of blur of the projection pattern, that is, a scale parameter defining the scale of the point spread function (hereinafter also referred to as the PSF), is used as the parameter. First, calibration is performed: blurred images actually observed at a plurality of known depths are used to obtain the PSF scale at each depth, and an accurate scale parameter is obtained by fitting. Then the structured light is actually projected onto the object and the projection pattern is imaged. Using the parameters obtained by the calibration, the image data of the captured projection pattern is deconvolved with the PSF, which varies with depth, and the depth at which the restored pattern is most similar to the blur-free lattice code pattern of the coded aperture is obtained as the estimation result.
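The exhaustive search just described can be illustrated in 1-D. Note that where Non-Patent Document 5 deconvolves the observation at every candidate depth, the sketch below deliberately uses the equivalent forward test instead, blurring the ideal pattern with each candidate PSF and scoring similarity, which sidesteps the unstable deconvolutions the next paragraph criticizes. The box PSF, its calibrated widths, and the pattern are all invented for illustration.

```python
# Illustrative 1-D version of per-depth PSF matching. Instead of the
# deconvolution used in Non-Patent Document 5, the ideal code pattern is
# blurred forward with each candidate depth's (calibrated) PSF and compared
# with the observation; the best-matching depth is the estimate.

def box_blur(signal, width):
    """Blur a 1-D signal with a moving-average (box) PSF of odd width."""
    half = width // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

def estimate_depth(observed, pattern, psf_width_by_depth):
    """Return the candidate depth whose PSF best explains the observation."""
    best_depth, best_err = None, float("inf")
    for depth, width in psf_width_by_depth.items():
        candidate = box_blur(pattern, width)              # forward model
        err = sum((a - b) ** 2 for a, b in zip(candidate, observed))
        if err < best_err:
            best_depth, best_err = depth, err
    return best_depth

pattern = [0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0]
psf_width_by_depth = {0.5: 1, 1.0: 3, 1.5: 5}   # calibrated widths (invented)
observed = box_blur(pattern, 3)                  # object actually at depth 1.0
print(estimate_depth(observed, pattern, psf_width_by_depth))  # -> 1.0
```

Even in this cheap form, the loop over every candidate depth makes the per-depth cost add up, which is the computational drawback the document raises next.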
However, in the method disclosed in Non-Patent Document 5, the similarity calculation and the deconvolution are carried out for all depths, so the deconvolution, which has a large computational cost, must be performed a huge number of times, and the calculation time becomes long. In addition, the result of the calculation is also unstable.
In addition, Takeda et al. proposed introducing a coded aperture into a simple stereo method so that the degradation of precision due to blur can be suppressed, and further proposed fusing the DfD method with the stereo method (for example, see Non-Patent Documents 6 and 7).
  • Applicants (English)
  • * For entries posted before July 2012: all designated states except the United States
  • KAGOSHIMA UNIVERSITY
  • HIROSHIMA CITY UNIVERSITY
  • Inventors (English)
  • KAWASAKI Hiroshi
  • HORITA Yuuki
  • ONO Satoshi
  • HIURA Shinsaku
  • FURUKAWA Ryo
International Patent Classification (IPC)
Designated states — National: AE AG AL AM AO AT AU AZ BA BB BG BH BN BR BW BY BZ CA CH CL CN CO CR CU CZ DE DK DM DO DZ EC EE EG ES FI GB GD GE GH GM GT HN HR HU ID IL IN IR IS JP KE KG KN KP KR KZ LA LC LK LR LS LU LY MA MD ME MG MK MN MW MX MY MZ NA NG NI NO NZ OM PA PE PG PH PL PT QA RO RS RU RW SA SC SD SE SG SK SL SM ST SV SY TH TJ TM TN TR TT TZ UA UG US UZ VC VN ZA ZM ZW
ARIPO: BW GH GM KE LR LS MW MZ NA RW SD SL SZ TZ UG ZM ZW
EAPO: AM AZ BY KG KZ RU TJ TM
EPO: AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
OAPI: BF BJ CF CG CI CM GA GN GQ GW KM ML MR NE SN ST TD TG
If you are interested in the content of this patent or wish to obtain a license, please use the contact address below.

