Shape Data Registration based on Structured Light Pattern Direction
Abstract:
Structured light is used for measuring 3D shapes efficiently by active stereo methods. The measurement resolution of this method depends not only on the interval of the structured light but also on the pattern direction. In this paper, we propose a method for acquiring homogeneous data by structured-light measurements. The method uses multiple structured lights with different directions and then registers the data based on the measurement density over the object surface. Our experimental results show that the proposed method can reproduce the 3D shape of the measured objects more precisely.
Keywords: active stereo method; structured light; data registration
1. Introduction
Technologies for measuring the three-dimensional (3D) shapes of physical objects have gathered more and more attention in various fields in recent years. Reverse engineering, for instance, has been introduced in the manufacturing business for a swift production process. While pre-existing digital shape models are available, the production process goes smoothly; once new models are required, however, costly design labour is needed even with up-to-date computer-aided design (CAD) and computer-aided modelling (CAM) software packages. Designing an object shape with a physical clay model is often more flexible than modelling with CAD/CAM software, but digital design data is better suited to the subsequent production processes. 3D shape capturing is capable of connecting the physical and digital design phases. Moreover, prior to mass production, recent rapid prototyping machines make physical test exemplars available [1, 2, 3]. Consequently, a smooth transition from physical to digital design has great potential to shorten the total time for product development. 3D shape scanning is also actively used in the fields of science and education. For instance, a number of cultural properties, including historical relics, are targeted for 3D scanning before they deteriorate [4]. Large-scale objects, such as a whole building structure or a section of a town, are also scanned and archived in a digital media form recently called "e-monuments". E-monuments have great potential for preserving the existing properties of those cultural heritages and reusing them for exhibition via computer graphics, with additional information for easy-to-understand display. Such digital archives enable not only quantitative but also qualitative analysis from an archaeological point of view.
One of the major techniques used for 3D measurement is optical triangulation with an active stereo system. A combination of a light emitter and an imaging sensor (a camera) is used in active stereo methods. For emitting light, a spot light or a slit light is generally used. Measurement with a spot light employs a focused laser spot and elaborately controls the light emission so that the scan is dense and uniform; consequently, spot- or point-based scanning takes time to sweep the surface and complete the measurement. Structured light is used for time-effective 3D scanning. The simplest structured light is line shaped, a slit beam, which is adopted in many commercial laser range scanners at the tabletop range scale. The slit beam forms a 3D plane that intersects the surface of the object. This allows capturing the range of points on the intersection line, and thus rapid measurement is possible compared with the spot light. Moreover, by coding the structured light, spatial division becomes more effective: consecutive illumination with several patterns of structured light is equivalent to the scanning resolution of hundreds of slit lines. Most structured-light 3D scanning systems are implemented as projector-camera systems, and a variety of coding methods have been proposed [5, 6]. A common usage of 3D scanning systems is to take multiple range images and put them together in one coordinate system to reconstruct the target object shape. Multiple measurements from different positions and angles allow capturing the whole shape of the object as well as compensating for missing parts occluded from the other views; thus, the quality of the captured data is supposed to be ensured by redundant multiple scans. We focus instead on the quality of a single scanned data set. Intrinsically, the visual distortion of a projected pattern on the object surface causes the parallactic displacement that enables optical triangulation. However, the parallactic displacement also gives an uneven distribution of the measured points. This paper proposes a method for acquiring homogeneous-density data from a fixed viewpoint with multiple structured light patterns.
2. Pattern Directivity in Structured Light
In general, 3D shape measurement with structured light scans along the structured line pattern. Figure 1 shows an example of the measurement data (right) with vertical line patterns (left). The observed intervals of the captured lines differ because the object faces vary face by face. Ridge lines, especially, are sampled with a different number of points. In Figure 1, the projected lines are almost parallel to the vertical ridges, which only one scan line traverses. The ridges characterize the whole shape of the cube; however, they are easily mis-sampled in the captured point set. Such characteristic parts tend to contain higher curvature, which requires more scanned points to reconstruct the original shape.
3. Proposed Approach
We focus on the fact that the same surface shape can be sampled with a different density by varying the scan-line direction. We employ multiple structured lights with different pattern directions and monitor the density of the captured points for each scan. Characteristic regions can be detected as those whose data density varies when the scan-line direction changes. Measurement results with higher density are then selected for the characteristic regions and registered into the final data set.
We categorize the edges of objects into three types: jump, roof, and smooth edges (see Figure 2). A jump edge is observed as a large gap in the range distribution seen from the measurement viewpoint; jump edges can be detected as discontinuities along the scan lines, and they generally appear on the contours of object silhouettes. A roof edge and a smooth edge have a continuous series of points on the scan lines, and the surface normal of the object changes continuously across them. Roof and smooth edges can therefore be detected by searching for high variance in the normal-vector distribution. Creating a mesh from the point set, which has no overlaps, reveals the adjacency relationships among the points. A normal vector can be calculated for each facet of the mesh, and the variance of the normal directions within a small region indicates the shape complexity of that region [7]. By comparing the range-data densities between the data sets acquired with the different structured lights, the higher-density data can be selected for each region. The measurement data can thus be supplemented by registration based on measurement density.
4. Shape Data Registration Scheme
Since the range data from a single camera view has no overlaps, it can be mapped onto the image plane of the camera. Hereafter, the image plane of the camera is called the "screen coordinates". Since the projector-camera system is fixed while the projected pattern direction is changed, mapping the 3D data to the 2D screen coordinates allows a simple process for comparing, selecting, and registering the measured points.
The registration process is as follows:
1. Measure the 3D shape with multiple structured lights of different pattern directions. Examine the continuity of the scan lines to detect jump-edge regions.
2. Form a triangular (Delaunay) mesh from each range data set and obtain the connectivity structure of the data points.
3. Calculate the normal vector at each vertex of the mesh and make a normal map by mapping the normal vectors onto the screen coordinates.
4. Examine the variance of the normal directions over the normal map.
5. Detect the high-variance regions of the normal vectors to find roof and smooth edges. Calculate the density of each range data set for every region containing jump, roof, or smooth edges, and register the higher-density data for that region (a sketch of this density-based selection follows the list).
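As an illustration of step 5, the following is a minimal Python sketch of the density-based selection; it is not taken from the paper's implementation. It assumes that the vertical-pattern and horizontal-pattern scans have already been mapped onto the same screen grid as H x W x 3 arrays (NaN where nothing was measured), and that `characteristic` is a per-tile boolean mask of the regions containing jump, roof, or smooth edges; the 7-pixel tile size follows the paper, while all names are hypothetical.

```python
import numpy as np

def tile_density(points, tile=7):
    """Count valid (non-NaN) measurements in each tile x tile block of the screen grid."""
    valid = ~np.isnan(points).any(axis=-1)
    h_t, w_t = valid.shape[0] // tile, valid.shape[1] // tile
    return valid[:h_t * tile, :w_t * tile].reshape(h_t, tile, w_t, tile).sum(axis=(1, 3))

def register_by_density(points_v, points_h, characteristic, tile=7):
    """Characteristic tiles take whichever scan sampled them more densely;
    the remaining tiles default to the horizontal-pattern scan, as in the experiments."""
    dens_v, dens_h = tile_density(points_v, tile), tile_density(points_h, tile)
    use_vertical = characteristic & (dens_v > dens_h)
    merged = []
    for (ty, tx), take_v in np.ndenumerate(use_vertical):
        block = (points_v if take_v else points_h)[ty * tile:(ty + 1) * tile,
                                                   tx * tile:(tx + 1) * tile]
        merged.extend(block[~np.isnan(block).any(axis=-1)])   # keep measured points only
    return np.asarray(merged)                                  # (N, 3) registered point set
```

Defaulting the non-characteristic tiles to the horizontal scan mirrors the choice described in the experiments, where both scans have similar quality in those regions.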
5. Implementation
5.1 Structured Light with Gray Code
In this paper, we implemented a projector-camera system for 3D scanning by structured light with a Gray code scheme [8]. A set of Gray codes can be expressed by consecutive lighting patterns. Each pattern is composed of black and white stripes, and N patterns generate N bits of Gray code. To establish the correspondence between the pixels of the projector and those of the camera, every pixel in the camera image is classified as either shaded by a black-stripe area or lit by a white-stripe area of the projector. For example, as shown in Figure 3, lighting with three projector patterns A, B and C generates 3 bits of Gray code. A lit area is read as 1 and a shaded area as 0, so each segmented area is coded independently by a "space code". Referring to the space code, a ray from a projector pixel and a ray from a camera pixel are coupled and form an epipolar triangle; the 3D coordinates can therefore be calculated as the intersection of the two rays from the calibrated projector-camera system.
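For concreteness, the following is a small Python sketch of the Gray-code step. It assumes a projector of width `width` pixels and camera bit images that have already been thresholded into 0/1 stripes; it illustrates the coding idea in [8], not the authors' code, and the function names are hypothetical.

```python
import numpy as np

def gray_code_patterns(width, n_bits):
    """Vertical Gray-code stripe patterns: row i holds the i-th bit (MSB first)
    of the Gray code of each projector column (0 = black stripe, 1 = white stripe)."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                                   # binary -> Gray code
    shifts = np.arange(n_bits - 1, -1, -1)
    return ((gray[None, :] >> shifts[:, None]) & 1).astype(np.uint8)   # (n_bits, width)

def decode_space_code(bit_images):
    """Per-pixel space code: bit_images is (n_bits, H, W) with 0/1 values from the
    thresholded camera photos, MSB first; returns the projector column index."""
    code = np.zeros(bit_images.shape[1:], dtype=np.int64)
    prev = np.zeros(bit_images.shape[1:], dtype=np.int64)
    for g in bit_images.astype(np.int64):                       # Gray -> binary, bit by bit
        b = prev ^ g
        code = (code << 1) | b
        prev = b
    return code

# Round trip: 8 patterns segment a 256-column projector into 256 distinct codes.
patterns = gray_code_patterns(256, 8)
assert np.array_equal(decode_space_code(patterns[:, None, :]), np.arange(256)[None, :])
```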
5.2 Measurement System
We implemented a measurement system to execute the process described above. The system uses a single projector-camera setup, as shown in Figure 4, and is capable of triangulation by Gray-coded structured light and of registration of the acquired data. A set of structured patterns is sequentially projected onto the target object from the projector, and the camera captures the scene for each projection. The system uses two pattern directions: vertical and horizontal. The 3D measurement results for the two patterns are shown in Figure 5; these data are series of points with 3D coordinates. The following is the process for obtaining a registered result from these range data. First, the measurement data shown in Figure 6 are mapped onto the screen coordinates (Figure 6(a)). Next, a Delaunay mesh is formed and the normal vector of each vertex is calculated. Third, a normal map is made on the screen coordinates by encoding each component of the normal vector as a colour (Figure 6(b)). Because the calculated normal vectors may contain noise, moving-average filtering smoothes the normal map. The variance of the normal directions is then computed in each small region of 7x7 pixels over the normal map, and the high-variance regions (the black regions in Figure 6(c)) are detected; the detection threshold was determined empirically. The measurement density is computed by counting the data points within each small 7x7-pixel region. Finally, by comparing the density measured with the vertical pattern against that measured with the horizontal pattern, a total data set is assembled from the higher-density points.
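The normal-map test in this pipeline can be sketched as follows; this is again a Python illustration, not the original implementation. It assumes a per-pixel unit normal map `normals` (H x W x 3, NaN where no data) already produced from the mesh, uses the same 7x7 window, and measures the directional spread as 1 - |mean normal|, a common spherical-variance proxy; the exact variance definition, smoothing, and threshold value are not specified in the paper beyond the threshold being empirical.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def normal_variance_map(normals, window=7):
    """Spread of normal directions in each window x window neighbourhood of the
    normal map: near 0 on locally flat surfaces, approaching 1 where normals vary.
    Gaps (NaN) are zero-filled, which also inflates the response near holes."""
    filled = np.nan_to_num(normals)
    # Box (moving-average) filter per component gives the local mean normal.
    mean = np.stack([uniform_filter(filled[..., c], size=window) for c in range(3)],
                    axis=-1)
    return 1.0 - np.linalg.norm(mean, axis=-1)

def characteristic_mask(normals, threshold=0.1, window=7):
    """Pixels whose neighbourhood shows high normal variance (roof / smooth edges).
    The threshold value here is an assumption; the paper only states it was empirical."""
    return normal_variance_map(normals, window) > threshold
```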
5.3 Pattern Boundary Interpolation
Range values are computed from camera images that capture the binary patterns projected onto the object surface. The image is sampled as a digital image, on which the boundaries of the binary stripes are used as scan lines. Although the projected patterns are continuously distorted by the object surface shape, the detected pattern lines appear jagged because of aliasing due to the limited spatial resolution of the pixels. The jagged boundary edges directly degrade the resulting shape data; therefore, a sub-pixel process is needed for detecting the boundary edge lines.
Attachment 2: Original Paper
Shape Data Registration based on Structured Light Pattern Direction
Tatsuya Ogino¹, Yoshihiro Yasumuro² and Masahiko Fuyuki²
¹Graduate School of Engineering, Kansai University, Osaka, Japan
²Faculty of Environmental and Urban Engineering, Kansai University, Osaka, Japan
Abstract
Structured light is used for measuring 3D shape efficiently by active stereo methods. The measurement resolution of this method depends on not only the interval of structured light but also the pattern direction. In this paper, we propose a method for acquiring homogeneous data by structured light measurements. The method uses multiple structured lights with different directions and then registers the data based on measurement density over the object surface. Our experimental results showed that the proposed method is capable of reproducing 3D shape of measurement objects more precisely.
Keywords: Active Stereo Method; Structured Light; Data Registration
1 INTRODUCTION
Technologies for measuring three-dimensional (3D) shapes of physical objects have gathered more and more attention in various fields in recent years. Reverse engineering, for instance, has been introduced in the manufacturing business for a swift production process. While pre-existing digital shape models are available, the production process goes smoothly; once, however, new models are required, costly designing labour is needed even by up-to-date computer-aided design (CAD) or computer-aided modelling (CAM) software packages. Designing object shape with a physical clay model is often more flexible than modelling with CAD/CAM software; however, digital design data is suitable for further production processes. 3D shape capturing is capable of connecting physical and digital designing phases. Moreover, prior to the mass-production, recent rapid prototyping machines and physical test exemplars are available [1, 2, 3]. Consequently, a smooth transition from physical to digital designing has a great potential to shorten the total time for production development. 3D shape scanning is also actively used for the fields of science and education. For instance, a number of cultural properties, including historical relics, are targeted for a 3D scan before they deteriorate [4]. Large-scale objects, such as a whole building structure or a section of a town, are also scanned and archived in a digital media form, recently called "e-monuments". E-monuments have a great potential for preserving existing properties of those cultural heritages and reusing them for exhibition via computer graphics with additional information for easy-to-understand display. Those digital archives enable not only quantitative but qualitative analysis from an archaeological point of view.
One of the major techniques used for 3D measurement is optical triangulation with an active stereo system. A combination of a light emitter and an imaging sensor (a camera) is used in active stereo methods. For emitting light, spot light or slit light is generally used. The measurement with spot light employs a focused laser spot light and elaborately controls the light emission to scan densely and uniformly. Consequently, spot- or point-based scanning takes time to sweep the surface to complete the measurement. Structured light is used for time-effective 3D scanning. The simplest structured light is line shaped, or a slit beam, which is adopted in many commercial laser range scanners for a tabletop range scale. The slit beam forms a 3D plane that intersects the surface of the object. This allows capturing the range of points on the intersection line, and thus, rapid measurement is possible compared with the spot light. Moreover, by coding with the structured light, spatial division becomes more effective. Consecutive illumination with several patterns of structured light is equivalent to hundreds of slit-lines scanning resolution. Most of the 3D scanning systems for structured light are implemented as projector-camera systems and a variety of coding methods are proposed [5, 6].
A common usage of 3D scanning systems is to take multiple range images and put them together in a coordinate system for reconstructing a target object shape. Multiple measurements with different positions and angles allow capturing the whole shape of the object as well as compensating for missing parts occluded from the other views. Thus, the quality of the captured data is supposed to be ensured by redundant multiple scans.
We focus rather on the quality of a single scanned data set. Intrinsically, visual distortion of a projected pattern on the object surface causes parallactic displacement that enables optical triangulation. However, the parallactic displacement also gives an uneven distribution of the measured points. This paper proposes a method for acquiring homogeneous density data from a fixed viewpoint with multiple structured light patterns.
2 PATTERN DIRECTIVITY IN STRUCTURED LIGHT
In general, 3D shape measurement with structured light scans along the structured line pattern. Figure 1 shows an example of the measurement data (right) with vertical line patterns (left).
Figure 1: Density difference of range data over the object surface

The observed intervals of the captured lines are different because the faces vary face-by-face. Ridge lines, especially, are sampled with a different number of points. In Figure 1, the projected lines are almost parallel to the vertical ridges on which only one scan line traverses. The ridges have the characteristic of forming the whole shape of the cube; however, they are easily mis-sampled in the captured point set. The characteristic parts tend to contain higher curvature, which requires more scanned points to reconstruct the original shape.
3 PROPOSED APPROACH
We focus on the fact that the same surface shape can be sampled with a different density by varying the scan line directions. We employ multiple structured lights with different pattern directions and monitor the density of captured points for each scan. Characteristic regions can be detected as the ones whose data density varies by changing scan line directions. Then measurement results with higher density are selected for characteristic regions and registered for the final data set.
We categorize the edges in objects into three types: jump, roof, and smooth edges (see Figure 2). A jump edge is observed as a large gap in a range distribution from the measurement viewpoint. The jump edges can be detected as discontinuities along the scan lines. They appear in the contour of objects' silhouettes in general. A roof edge and a smooth edge have a continuous series of points on the scan lines. The normal vector on the surface of the object continuously changes as well. A roof edge and a smooth edge can be detected by searching for high variance in the normal vector distribution. Creating a mesh from the point sets without overlap allows finding adjacency relationships among the points. A normal vector in each facet on the mesh can be calculated, and the variance of the direction of the normal vector within a small region shows the shape complexity of the region [7]. Comparing the densities of range data between data sets acquired by different structured lights, higher density data can be selected for each region. The measurement data can be supplemented by registration based on measurement density.
Figure 2: Edge types
4 SHAPE DATA REGISTRATION SCHEME
Since there are no overlaps of the range data from a single view of a camera, range data can be mapped on the image plane of the camera. Hereafter, the image plane of the camera is called the 'screen coordinates'. Since the projector-camera system is fixed while changing pattern directions to project, mapping 3D to 2D screen coordinates allows a simpler process for comparing, selecting, and registering the measured points. The registration process is as follows:
1. Measure the 3D shape with multiple structured lights of different pattern directions. Examine the continuities of the scan lines to detect jump edge regions.
2. Form a Delaunay mesh from a range data set and acquire a connecting structure of the data points (a sketch of this step and the next follows the list).
3. Calculate a normal vector at each vertex on the Delaunay mesh. Make a normal map by mapping the normal vectors onto the screen coordinates.
4. Examine the variance of the normal directions over the normal map.
5. Detect high variance regions of normal vectors to find roof and smooth edges. Calculate the density of each range data set for each region containing jump, roof, and smooth edges, and register the higher density data for the region.
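Steps 2 and 3 can be illustrated with a short Python sketch (an example under stated assumptions, not the authors' code): given the measured points `xyz` and their screen coordinates `uv`, a Delaunay triangulation on the screen plane supplies the connectivity, and per-vertex normals are obtained by averaging the adjacent facet normals; splatting these normals back onto the screen grid then yields the normal map used in step 4.

```python
import numpy as np
from scipy.spatial import Delaunay

def vertex_normals(uv, xyz):
    """uv: (N, 2) screen coordinates of the measured points, xyz: (N, 3) 3D points.
    Returns (N, 3) unit normals averaged from the facets of a 2D Delaunay mesh."""
    tri = Delaunay(uv)                                   # connectivity from screen coordinates
    normals = np.zeros_like(xyz, dtype=float)
    for i0, i1, i2 in tri.simplices:
        facet_n = np.cross(xyz[i1] - xyz[i0], xyz[i2] - xyz[i0])   # unnormalised facet normal
        normals[[i0, i1, i2]] += facet_n                 # accumulate at the three vertices
    length = np.linalg.norm(normals, axis=1, keepdims=True)
    return normals / np.where(length > 0, length, 1.0)
```

The sign of each facet normal depends on the vertex ordering; in practice the normals would additionally be oriented consistently towards the camera.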
5 IMPLEMENTATION
5.1 Structured light with gray code
In this paper, we implemented a projector-camera system for 3D scanning by structured light with a gray code system [8]. A set of gray codes can be expressed by consecutive lighting patterns. Each pattern is composed of black and white stripes. N patterns generate N bits of gray code. To establish the correspondence between the pixels of a projector and those of a camera, every pixel on the camera image is estimated as either shaded by a black-stripe area or lighted by a white-stripe area from the projector. As shown in Figure 3, for example, lighting with a projector with 3 patterns A, B and C generates 3 bits of gray code. The lighted area is segmented to be read as 1 and the shaded area as 0. The segmented area is coded independently by a "space code". Referring to the space code, a ray from a projector pixel and a ray from a camera pixel are coupled and form an epipolar triangle, and thus, the 3D coordinates can be calculated as the intersection of the 2 rays from a calibrated projector and camera system.

Figure 3: Gray code pattern (3-bit)
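The final triangulation step can be sketched as the intersection (in practice, the midpoint of the shortest segment) of the two corresponding rays. The following Python fragment assumes the ray origins and directions have already been recovered from the space code and the projector/camera calibration, which is not shown here; the function name is hypothetical.

```python
import numpy as np

def intersect_rays(o_cam, d_cam, o_prj, d_prj):
    """Midpoint of the shortest segment between the camera ray (o_cam + t*d_cam)
    and the projector ray (o_prj + s*d_prj); returns None for (near-)parallel rays."""
    u = d_cam / np.linalg.norm(d_cam)
    v = d_prj / np.linalg.norm(d_prj)
    w = o_cam - o_prj
    b = u @ v
    denom = 1.0 - b * b
    if abs(denom) < 1e-12:
        return None
    t = (b * (v @ w) - (u @ w)) / denom      # parameter along the camera ray
    s = ((v @ w) - b * (u @ w)) / denom      # parameter along the projector ray
    return 0.5 * ((o_cam + t * u) + (o_prj + s * v))

# Example: rays from (0,0,0) along +z and from (1,0,0) along (-1,0,1) meet at (0,0,1).
print(intersect_rays(np.zeros(3), np.array([0., 0., 1.]),
                     np.array([1., 0., 0.]), np.array([-1., 0., 1.])))
```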
Figure 4: Measurement system setup
5.2 Measurement system
We implemented a measurement system to execute the process mentioned above. This system uses a single projector and camera system as shown in Figure 4 and is capable of triangulation by structured light with gray code and registration of the acquired data. A set of structured patterns is sequentially projected onto a target object from the projector, and the camera captures the scene for each projection. The system uses two kinds of directional patterns: vertical and horizontal. 3D measurement results for the two patterns are shown in Figure 5. These data are a whole series of points that have 3D coordinates. The following is the process to acquire a registered result from these range data. First, map the measurement data shown in Figure 6 onto the screen coordinates (Figure 6(a)). Next, form a Delaunay mesh and calculate the normal vector of each vertex. Third, make a normal map on the screen coordinates by separating each component of a normal vector by colour (Figure 6(b)). Because the calculated normal vectors might contain some noise, moving-average filtering smoothes the normal map. The variance distribution of the normal vector is calculated in each small region of 7x7 pixels over the normal map. Then high variance regions of the normal vector (the black regions of Figure 6(c)) are detected. The threshold for detection was empirically determined. The data density is computed by counting the measurement points within the small region of 7x7 pixels. Finally, by comparing the density measured by the vertical pattern with the one measured by the horizontal pattern, a total data set is filled with higher density points.
Figure 5: Scanning results with vertical pattern (a) and horizontal pattern (b)
Figure 6: Measurement system

5.3 Pattern boundary interpolation

Range values are computed from camera images that capture projected binary patterns on the object surface. This image is sampled as a digital image, on which boundaries of the binary stripes are used as scan lines. Though the projected patterns are continuously distorted due to the object surface shape, the detected pattern lines are jagged because of aliasing with the limited spatial resolution of pixels. The jagged boundary edges directly deteriorate the resulting shape data. Therefore, a sub-pixel process is needed for detecting boundary edge lines. In this paper, positive and negative patterns are used for each stripe width [7]. A negative pattern is the inverted (dual) pattern of the positive pattern. While the pixel intensity has an upward slope in the positive projection, there must be a downward slope in the dual projection image. In the vicinity of the boundary lines, the pattern border can be computed as a set of intersecting points of the upward and downward curves, by comparing the dual images. Continuous boundary lines are acquired by interpolating the positions of the intersection points in the image plane. Figure 7 shows a comparison between shape reconstruction results with and without sub-pixel interpolation for boundary line detection.
Figure 7: Non-interpolated result (left) and interpolated result (right)
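A minimal Python sketch of the sub-pixel border estimate for one image row is given below. It assumes `pos_row` and `neg_row` are the intensity profiles of the positive and negative projections along the same scan row, and simply interpolates where the two profiles cross; it illustrates, but does not reproduce exactly, the interpolation described above.

```python
import numpy as np

def subpixel_crossings(pos_row, neg_row):
    """Sub-pixel positions (in pixel units) where the positive-pattern intensity
    profile crosses the negative-pattern profile along one camera row."""
    diff = pos_row.astype(float) - neg_row.astype(float)
    left = np.where(np.signbit(diff[:-1]) != np.signbit(diff[1:]))[0]  # crossing in (i, i+1)
    d0, d1 = diff[left], diff[left + 1]
    return left + d0 / (d0 - d1)            # linear interpolation of the zero crossing

# Example: the profiles cross between pixels 2 and 3, at about x = 2.30.
print(subpixel_crossings(np.array([10, 30, 80, 200, 240]),
                         np.array([240, 210, 150, 40, 10])))
```

Repeating this per row (and per column for horizontal stripes) and per bit pattern yields the continuous boundary lines used for the interpolated result in Figure 7.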
6 EXPERIMENTS
6.1 Experimental conditions
We conducted an experiment, implementing the process described in the previous section. We measured a cup shown in Figure 8(a) and a plaster statue shown in Figure 8(b). The positions of the projector, the camera and the target object are shown in Figure 10. Figure 9(a) shows a scene where a horizontal pattern was projected onto the cup. Figure 9(b) also shows a scene with a vertical pattern. We used eight sheets of patterns for each of the vertical and horizontal directions. This pattern is capable of segmenting the space into 256, which is equivalent to 256 scanning lines. The projector we used is a DLP type and the maximum brightness is 2500 lumens. The capture size from the camera is 640x480 pixels.
Figure 8: Target objects: (a) cup and (b) plaster statue
Figure 9: (a) Horizontal projection and (b) vertical projection
6.2 Results and Discussion
Figure 11(a) and Figure 18(a) show results measured by the vertical pattern. Figure 12(a) and Figure 19(a) show results by the horizontal pattern. In a complex characteristic region, such as the handle edges of the cup and the face and the body parts of the plaster statue, the measurement density varies along the surface direction. Figure 15 and Figure 20 show the selected regions for registration by the proposed method. In characteristic regions, result data measured by either the vertical or the horizontal pattern are selected based on measurement density. In non-characteristic regions, both results have similar quality; the horizontal pattern data is used in the figures. Figure 16(a) and Figure 21(a) show registered results of these data. The lower density areas are complemented through registration. Figures 11(b), 12(b), 17, 18(b), 19(b) and 22 show reconstructed 3D surfaces. In the 3D surface rendering representation, the registration result complements the low-density regions. This shows that the registration on the screen coordinates effectively functioned for 3D shape reconstruction. Figures 13 and 14 show close-ups of the handle part. The large curvature regions are represented as continuously shaped surfaces with comparatively dense measurement points.
7 SUMMARY
In this paper, our aim was to make the most use of projector-camera system resolutions that are degraded by the relative geometrical conditions, such as pattern directions and edge shapes. To clear up this problem, we proposed a method for acquiring homogeneous density data by multiple structured light measurements. Focusing on measurement density, an automatic scheme for managing the data resolution is achieved. The experimental results showed the advantageous capability of reproducing 3D shapes of measurement objects more precisely. Our future work addresses how to design more flexible patterns instead of only patterns in two directions.
REFERENCES
[1] Andrew Willis, Jasper Speicher, David B. Cooper, 2007, Rapid prototyping 3D objects from scanned measurement data, Image and Vision Computing, 25/7: 1174-1184.
[2] Krause, F.-L., Kimura, F., Kjellberg, T., Lu, S.C.-Y., 1993, Product Modelling, Annals of the CIRP, 42/2: 695-706.
[3] T. Varady, R. R. Martin and J. Cox, 1997, Reverse engineering of geometric models. An introduction, Computer-Aided Design, 29/4: 255-268.
[4] A. Banno, T. Masuda, T. Oishi, and K. Ikeuchi, 2008, Flying Laser Range Sensor for Large-Scale Site-Modeling and Its Applications in Bayon Digital Archival Project, International Journal of Computer Vision, 78/2-3: 207-222.
[5] Samet, H., 1990, Applications of Spatial Data Structures, Addison-Wesley, Reading, MA.
[6] E. Mouaddib, J. Batlle, and J. Salvi, 1997, Recent Progress in Structured Light in order to solve the correspondence problem in Stereo Vision, IEEE Int. Conf. on Robotics and Automation, 130-136.
[7] Michikazu Matsumoto, Masataka Imura, Yoshihiro Yasumuro, Yoshitsugu Manabe, Kunihiro Chihara, 2004, Support System for Measurement of Relics based on Analysis of Point Clouds, Proceedings of the Tenth International Conference on Virtual Systems and MultiMedia, p. 195, Gifu, Japan.
[8] S. Inokuchi, K. Sato, F. Matsuda, 1984, Range imaging system for 3-D object recognition, in: Proc. of the International Conference on Pattern Recognition, 806-808.
Figure 11: Vertical pattern measurement: (a) scanned points and (b) 3D reconstructed result
Figure 12: Horizontal pattern measurement: (a) scanned points and (b) 3D reconstructed result
Figure 13: Measurement results (close-up of handle part): (a) vertical pattern, (b) horizontal pattern, and (c) registration result
Figure 14: 3D rendered result (close-up of handle part): (a) vertical pattern, (b) horizontal pattern, and (c) registration result
Figure 15: Selected pattern for registration: (a) vertical pattern, (b) horizontal pattern, (c) non-characteristic regions
Figure 16: Registration result (camera plane)
Figure 17: Registration result (rendered 3D surface)
Figure 18: Vertical pattern measurement: (a) scanned points and (b) 3D reconstructed result
Figure 19: Horizontal pattern measurement: (a) scanned points and (b) 3D reconstructed result
Figure 20: Selected pattern for registration: (a) vertical pattern, (b) horizontal pattern, (c) non-characteristic regions
Figure 21: Registration result (camera plane)
Figure 22: Registration result (rendered 3D surface)