US010217217B2

(12) United States Patent
Dhruwdas

(10) Patent No.: US 10,217,217 B2
(45) Date of Patent: Feb. 26, 2019

(54) SYSTEMS AND METHODS FOR OBTAINING 3-D IMAGES FROM X-RAY INFORMATION

(71) Applicant: Indian Institute of Technology Bombay, Mumbai (IN)

(72) Inventor: Karade Vikas Dhruwdas, Pune (IN)

(73) Assignee: INDIAN INSTITUTE OF TECHNOLOGY BOMBAY, Mumbai (IN)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 15/685,863

(22) Filed: Jul. 20, 2017

(65) Prior Publication Data: US 2017/0323443 A1, Nov. 9, 2017

Related U.S. Application Data
(63) Continuation of application No. PCT/IN2016/000021, filed on Jan. 20, 2016.

(30) Foreign Application Priority Data: Jan. 20, 2015 (IN) 199/MUM/2015

(51) Int. Cl.: G06K 9/00 (2006.01); G06T 7/00 (2017.01); G06T 11/00 (2006.01)

(52) U.S. Cl.: CPC: G06T 7/012 (2013.01); G06K 9/00208 (2013.01); G06T 11/006 (2013.01); G06K 2209/055 (2013.01); G06T 2207/10116 (2013.01); G06T 2207/30008 (2013.01); G06T 2210/41 (2013.01)

(58) Field of Classification Search: CPC: A61B 6/5235; A61B 6/585; G06T 7/0028; G06T 11/60; G06K 9/20; G06K 9/4642. See application file for complete search history.

Primary Examiner: Duy M Dang
(74) Attorney, Agent, or Firm: Raa Alley IP

(57) ABSTRACT

Methods, hardware, and software transform 2D anatomical X-ray images into 3D renderings for surgical preparation. X-ray images of a body part are identified by camera model. A contour is extracted from the X-ray. Each anatomical region of the contour is assigned 2D anatomical values. A separate 3D template for the body part is modified to match the X-ray image by extracting silhouette vertices from the template and their projections. The template is aligned with the X-ray image and projected on an image plane to obtain a 2D projection model. The template is modified to match the anatomical values by comparing the projection with the corresponding anatomical values. Best matching points on the contour for extracted silhouette vertex projections are identified and used to back-project corresponding silhouette vertices. The 3D template is deformed so that its silhouette vertices match the target positions, resulting in a 3D reconstruction for the X-ray image.

26 Claims, 18 Drawing Sheets

[FIG. 1 (Sheet 1 of 18): schematic block diagram of an example embodiment system, with an inputter 12, importer 14, contourer 16, and estimator and modifier blocks 22 and 24.]
[FIG. 2 (Sheet 2 of 18): camera model source positioning.]
[FIG. 3A (Sheet 3 of 18): anatomical regions for femur and tibia: femoral neck, femoral trochanter, femoral ball, femoral shaft, femoral medial and lateral condyles; tibial proximal condyle, tibial shaft, tibial distal condyle.]
[FIG. 3B (Sheet 4 of 18): anatomical landmarks and anatomical parameters (marked *) for femur and tibia, including the femoral ball landmark, femoral neck axis*, femoral neck angle*, length of femoral mechanical axis*, femoral shaft axis*, femoral medial-lateral condylar width*, femoral medial/lateral and distal condylar landmarks, tibial proximal and distal condylar landmarks, tibial medial-lateral condylar width*, and tibial shaft axis*.]
[FIG. 3C (Sheet 5 of 18): anatomical regions distinguished in the X-ray contour: femoral neck, trochanter, ball, shaft, lateral and medial condyles; tibial proximal condyle, shaft, and distal condyle.]
[FIG. 3D (Sheet 6 of 18): anatomical landmarks identified based on the anatomical regions, including a shaft-neck landmark, with anatomical parameters marked *.]
[FIG. 4 (Sheet 7 of 18): template projection points, 2D meshing, constrained 2D meshing, identifying boundary edges, extracting boundary edges, and the categorized projection contour.]
[FIG. 5 (Sheet 8 of 18): femur and tibia images with corresponding transformations to the template (elements 205, 208).]
[FIG. 6 (Sheet 9 of 18): the template model before and after alignment.]
[FIG. 7 (Sheet 10 of 18): template deformation.]
[FIG. 8 (Sheet 11 of 18): deformation for local matching (elements 401, 402, 404, 405).]
[FIG. 9 (Sheet 12 of 18): initial template projection, projection after rotation of the 3D template about the femoral shaft axis, and the template projection contour aligned and scaled to the input femoral contour.]
[FIG. 10 (Sheet 13 of 18).]
[FIG. 11 (Sheet 14 of 18): flowchart (steps S1 onward) of 3D reconstruction from a single X-ray image: import an X-ray image (ML or AP) of a bone with no markers and unknown bone orientation, identify distinct anatomical regions (input contours), align and project the template according to the camera model, find best matching contour points, back-project each best matching point and take the position on the back-projected ray closest to the corresponding silhouette vertex as its target position, and deform the 3D template so that the silhouette vertices achieve their target positions using a given Laplacian mesh deformation algorithm.]
[FIG. 12A (Sheet 15 of 18): flowchart: import an AP X-ray image of the knee (no markers, unknown bone orientation), perform 3D reconstruction with respect to the ML image using the method of FIG. 11, then use the reconstructed 3D model as a 3D template and perform the reconstruction with respect to the AP X-ray image.]
[FIG. 12B (Sheet 16 of 18): flowchart: perform the steps of FIG. 11 up to S10 with respect to the ML image and the AP image independently, then deform the 3D template so that the silhouette vertices extracted with respect to both images achieve their corresponding target positions together, using the given Laplacian mesh deformation algorithm.]
[FIG. 13 (Sheet 17 of 18): flowchart for aligning the 3D template, initially at an arbitrary position relative to the input image plane (ML or AP): rotate the template about the axis normal to the image plane so that the principal axis of its projection matches that of the input contour, extract landmark vertices and their projections, estimate the bone's major dimension from the input contour, then rotate the template about an in-plane axis (Step 1) and again about the normal axis (Step 2) using an optimization-based method to best fit the template projection contour to the input contour.]
[FIG. 14 (Sheet 18 of 18): flowchart of 3D reconstruction from two orthogonal X-ray images: import ML and AP X-ray images, define the camera model and digital magnification for each image, contour distinct anatomical regions, extract landmarks and 2D anatomical parameters (lengths and angles), obtain 3D anatomical parameter values from both images, selectively modify the template to match those 3D values, find best matching points on the input contours for the template projection contour points (2D-2D correspondence) using a self-organizing-maps (SOM) based method, back-project each best matching point to obtain target positions for the silhouette vertices (2D-3D correspondence), and deform the 3D template with the given Laplacian mesh deformation algorithm.]

SYSTEMS AND METHODS FOR OBTAINING 3-D IMAGES FROM X-RAY INFORMATION

RELATED APPLICATIONS

This application claims priority under 35 U.S.C. § 120 to, and is a continuation of, co-pending International Application PCT/IN2016/000021, filed Jan. 20, 2016 and designating the US, which claims priority to Indian Application 199/MUM/2015, filed Jan. 20, 2015, such Indian Application also being claimed priority to under 35 U.S.C. § 119. These Indian and International applications are incorporated by reference herein in their entireties.
BACKGROUND

Surgical planning is a preoperative method of visualising a surgical intervention, to set out the surgical steps and bone segment navigation in the context of computer-assisted surgery. Surgical planning is important in orthopedic surgery, neurosurgery, oral and maxillofacial surgery, etc. Execution, or transfer of the surgical planning to the patient, is generally performed with a medical navigation system.

Some orthopedic surgeries, like knee or hip replacement, include cutting or drilling on an irregular-shaped bone. Performance and accuracy of such surgeries improve if the surgery is planned pre-operatively. Surgeons are trained to use conventional 2D image data to prepare for their complex procedures. Such planning may be made from X-ray images or CT data sets or the like. CT data sets are large compared to X-ray images. Hard copies of X-ray images of the particular region of the patient's body for operation, such as a knee or hip joint, or digital X-ray images on a PC, can be used for 2D operational planning.

SUMMARY

Example embodiments include computer systems for transforming 2D anatomical X-ray images into 3D renderings for surgical preparation through example methods. Such methods include taking an X-ray image of a body part to be converted to 3D and determining a camera model of the X-ray image. For example, spatial values of the X-ray source and body part may indicate the camera model. A contour of the body part is extracted from the X-ray and analyzed based on its anatomical regions. Each region is assigned 2D anatomical values in the contour. A separate 3D template for the body part is then modified to match the 2D X-ray images by extracting silhouette vertices from the 3D template and their projections, according to the camera model and how those features are initially aligned in the template. The template can then be aligned with the X-ray image and projected on an image plane for the appropriate camera model to obtain a 2D projection model. The template is then modified to match the 2D anatomical values by comparing the 2D projection with the corresponding identified anatomical values. A best matching point on the contour, for each extracted silhouette vertex projection, is identified between the 2D projection and contour. The resulting matching points are then back-projected based on the camera model to form a back-projected ray with target positions that are closest to a corresponding silhouette vertex. The 3D template can then be deformed so that its silhouette vertices match the target positions, resulting in a 3D image that corresponds to the 2D X-ray image.
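As a concrete illustration of the back-projection step summarized above, the following short sketch (not part of the patent disclosure; the function name, sample coordinates, and the simple point-source geometry are assumptions made for illustration) computes a silhouette vertex's target position as the point on a back-projected ray that is closest to that vertex:

import numpy as np

def target_position(source, matched_point, silhouette_vertex):
    """Back-project `matched_point` from the X-ray point source and return
    the point on that ray closest to `silhouette_vertex` (all inputs 3D)."""
    direction = matched_point - source
    direction = direction / np.linalg.norm(direction)
    # Orthogonal projection of the vertex onto the ray from the source.
    t = max(np.dot(silhouette_vertex - source, direction), 0.0)
    return source + t * direction

# Hypothetical example: point source above the image plane (z = 0).
src = np.array([0.0, 0.0, 1000.0])   # X-ray point source
pt = np.array([25.0, 40.0, 0.0])     # best-matching contour point on the image plane
vtx = np.array([20.0, 35.0, 150.0])  # silhouette vertex of the 3D template
print(target_position(src, pt, vtx))

In example methods, one such target position would be computed for each silhouette vertex before the template is deformed to reach those targets.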
BRIEF DESCRIPTIONS OF THE DRAWINGS

Example embodiments will become more apparent by describing, in detail, the attached drawings, wherein like elements are represented by like reference numerals, which are given by way of illustration only and thus do not limit the example embodiments herein.

FIG. 1 is an illustration of a schematic block diagram of an example embodiment system.
FIG. 2 is an illustration of a camera model source positioning.
FIG. 3A is an illustration of anatomical regions for femur and tibia.
FIG. 3B is an illustration of anatomical landmarks and the anatomical parameters for femur and tibia.
FIG. 3C is an illustration of anatomical regions corresponding to the regions distinguished in the contour of the X-ray image.
FIG. 3D is an illustration of anatomical landmarks identified based on anatomical regions.
FIG. 4 is an illustration of triangulation of projected points, meshing after putting constraints, and the outer contour calculation.
FIG. 5 is an illustration of femur and tibia images with corresponding transformations to the template.
FIG. 6 is an illustration of the template model before and after the alignment.
FIG. 7 is an illustration of template deformation.
FIG. 8 is an illustration of deformation for local matching.
FIG. 9 is an illustration of extraction of separate boundary contours for a bone shaft, from an ML view X-ray image.
FIG. 10 is an illustration of template alignment with respect to a medial-lateral image.
FIG. 11 is a flowchart of an example method of 3D image reconstruction from a single X-ray image.
FIG. 12A is a flowchart of an example method of 3D image reconstruction and template deformation separately with respect to the ML and then the AP X-ray image.
FIG. 12B is a flowchart of an example method of 3D image reconstruction and template deformation simultaneously with respect to the ML and AP X-ray images.
FIG. 13 is a flowchart of an example method of determining alignment of the template with respect to the input X-ray image.
FIG. 14 is a flowchart of an example method of 3D image reconstruction from two orthogonal X-ray images.

DETAILED DESCRIPTION

Because this is a patent document, general broad rules of construction should be applied when reading it. Everything described and shown in this document is an example of subject matter falling within the scope of the claims, appended below. Any specific structural and functional details disclosed herein are merely for purposes of describing how to make and use examples. Several different embodiments and methods not specifically disclosed herein may fall within the claim scope; as such, the claims may be embodied in many alternate forms and should not be construed as limited to only examples set forth herein.

It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited to any order by these terms. These terms are used only to distinguish one element from another; where there are "second" or higher ordinals, there merely must be that many number of elements, without necessarily any difference or other relationship. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments or methods. As used herein, the term "and/or" includes all combinations of one or more of the associated listed items. The use of "etc." indicates the inclusion of all other elements belonging to the same group of the preceding items, in any "and/or" combination(s).

It will be understood that when an element is referred to as being "connected," "coupled," "mated," "attached," "fixed," etc. to another element, it can be directly connected to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected," "directly coupled," etc. to another element, there are no intervening elements present.
Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., "between" versus "directly between," "adjacent" versus "directly adjacent," etc.). Similarly, a term such as "communicatively connected" includes all variations of information exchange and routing between two electronic devices, including intermediary devices, networks, etc., connected wirelessly or not.

As used herein, the singular forms "a," "an," and "the" are intended to include both the singular and plural forms, unless the language explicitly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used herein, specify the presence of stated features, characteristics, steps, operations, elements, and/or components, but do not themselves preclude the presence or addition of one or more other features, characteristics, steps, operations, elements, components, and/or groups thereof.

As used herein, "3D" means 3-dimensional, while "2D" means 2-dimensional. The structures and operations discussed below may occur out of the order described and/or noted in the figures. For example, two operations and/or figures shown in succession may in fact be executed concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Similarly, individual operations within example methods described below may be executed repetitively, individually, or sequentially, to provide looping or other series of operations aside from the single operations described below. It should be presumed that any embodiment or method having features and functionality described below, in any workable combination, falls within the scope of example embodiments.

The inventors have recognized that even well-trained surgical planners can struggle with the limited information that is available in 2D surgical planning and/or without trying multiple approaches in planning prior to the operation. 3D virtual surgical planning may aid in determining the best plan and transferring it to reality. Particularly, surgery planning in a 3D view may be more accurate, realistic, and/or satisfying (to a surgeon as well as a patient) as compared to a conventional process of 2D view-based planning. 3D planning, however, requires rendering of a 3D image from available data. The inventors have recognized that X-ray images may be used for 3D reconstruction so that computational devices like mobile phones or tablet computers, which have relatively lesser computational prowess, can also be used for the reconstruction process. Portability provided by such devices allows for greater flexibility in a healthcare environment. Hard copies of X-ray images of the region of the patient's body for operation, however, may not allow a surgeon to simulate post-operative conditions and/or may be an inconvenient way to perform measurements. Moreover, digital X-rays only provide 2D visualization of internal bone/joint anatomy and hence do not give accurate views, orientations, simulations, and/or the feeling of surgery in a 3D environment.

A 3D surgical planning environment with 3D bone shapes may require a 3D virtual model of the bone. While such 3D models may be derived from CT scans of the bone anatomy of a patient, CT scans involve health risk, cost, and time, such that medical professionals may not prefer to perform surgery planning using CT scans.
Moreover, 3D model reconstructions from CT scans are difficult on portable mobile devices, due to data size and computational requirements. Conversion of CT data to a 3D model is anyway time-consuming and requires significant manual inputs. Transferring CT scan data over the internet/network for various applications like teleradiology, collaborative diagnosis, sharing and saving a diagnosis or surgery planning, and cloud-based medical applications based on 3D visualization of patients' anatomy may further be burdensome.

The inventors have newly recognized that conversion of 2D X-ray images into 3D models may solve the above and other problems. Converting 2D X-ray images into 3D models may be computationally heavy and/or require X-ray images to be input in a way requiring a radiologist or surgeon to take extra care and/or use a special imaging device or a calibration device. In addition to the advantages of 3D surgical planning, 3D images/models of the bone can also be used for printing the bones into plastic models for informing patients about the surgery and/or training and visual/model-based surgery planning. 3D models of bones can also be used for printing patient-specific instrumentation used in orthopedic surgeries. Use of 2D X-rays for 3D modelling does not require a patient to undergo the health risk or expense of CT scanning. 2D imaging data is further much smaller and much more easily transferred than CT scan data for transfer to an instrumentation manufacturer. Thus, to overcome these newly-recognized problems as well as others and achieve these advantages, the inventors have developed example embodiments and methods described below to address these and other problems recognized by the inventors with unique solutions enabled by example embodiments.

The present invention is devices, software as stored or executed on tangible computer-readable media, and methods for converting 2D X-rays into full 3D pre-operation planning models. In contrast to the present invention, the few example embodiments and example methods discussed below illustrate just a subset of the variety of different configurations that can be used as and/or in connection with the present invention.

FIG. 1 is an illustration of a block diagram of an example embodiment system 1 useable to obtain 3D images using conventional 2D X-ray images. For example, 3D models of bones may be generated from one or two 2D X-ray images/radiographs. Example embodiment system 1 is processor-based, and actions of system 1, including where example embodiment system 1 executes example methods, depend upon the processor(s) being specially configured for the same. As shown in FIG. 1, an X-ray inputter 12 provides X-ray images for conversion. Inputter 12 may acquire the X-ray images through known procedures with conventional single-view X-ray imaging equipment. Orthogonal X-ray images from biplanar imaging may also be used. Such X-ray images from inputter 12 may include medial-lateral and anterior-posterior views. The X-ray images may not have any markers and/or have any known orientation with respect to the bone.

Alternatively, or additionally, a data importer 14 may import a patient's X-ray image(s) in digital format. For example, importer 14 may be a scanner configured to convert X-rays in hard copy format to a digitized format. This digitization may be done simply by using a camera, an X-ray digitizer, and/or an X-ray film scanner that converts the X-ray into a digital format, such as any of the formats selected from JPG/TIFF/PNG or DICOM format or the like. The X-ray images imported can belong to the medial-lateral (ML) view or the anterior-posterior (AP) view or both. Such imported images may be processed for 3D reconstruction as final X-ray images in a digital format.
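The following minimal sketch (illustrative only, not part of the patent text; the class, function, and file names are hypothetical, and Pillow and NumPy are assumed to be available) shows one way an imported, digitized X-ray image and its view could be represented before further processing:

from dataclasses import dataclass
import numpy as np
from PIL import Image

@dataclass
class XRayImage:
    pixels: np.ndarray  # grayscale pixel array of the final X-ray image
    view: str           # "ML" (medial-lateral) or "AP" (anterior-posterior)

def import_xray(path: str, view: str) -> XRayImage:
    """Load a scanned or digital X-ray (e.g., JPG/PNG/TIFF) as grayscale."""
    img = Image.open(path).convert("L")
    return XRayImage(pixels=np.asarray(img, dtype=np.float32), view=view)

# Example usage with a hypothetical file:
# knee_ml = import_xray("knee_ml.png", view="ML")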
For 2D-to-3D conversion, camera model determinator 18 may detect whether an X-ray image is ML or AP, using known parameters. As shown in FIG. 2, image plane 101 is a plane in a 3D imaging space that corresponds to detector plane 101, a plane coinciding with the flat X-ray sensor panel or film of the real imaging environment, where the projection of the body/object/bone is formed. Image center 102 is the central position of a rectangular detector. For example, image center 102 may be the normal position on image plane 101, which coincides with the X-ray sensor panel or film as placed during the imaging.

The determined camera model is used for 3D reconstruction to mimic the real X-ray imaging environment and includes the following: the position of X-ray source 104, such as a point source corresponding to the real X-ray source of the imaging equipment, with respect to image plane 101 in the imaging space; and the distance 103 between centroid 106 of an object such as bone 50 and X-ray source 104, measured in the direction normal 107 to image plane 101 in the imaging space.

As shown in FIG. 2, for the camera model a position of source 104 with respect to image center 102, a source film distance (SFD) 105, and a source object distance (SOD) 103 are defined. The position of X-ray source 104 with respect to image center 102 is determined so that a normal of image plane 101 arising from image center 102 will coincide with source 104 at a known distance, called source film distance 105, from image center 102. Typically, SFD 105 is equal to the distance between X-ray source 104 and the detector, measured along the direction that is normal 107 to detector plane 101.

Source object distance 103 may be defined as the distance between X-ray source 104 and bone centroid 106, which is the average position of all the surface points of bone 50, measured along direction normal 107 to image plane 101. A camera calibration perspective ratio K may be defined as the ratio of SOD 103 to SFD 105. SOD 103 may either be a known parameter or may be approximated. An example method to determine SOD 103 approximately is disclosed below.

A spherical ball marker with a known actual diameter (for example, 25 mm) is placed near the object (bone 50/body) during X-ray imaging, closer to image center 102, at a height from detector plane 101 that is closer to the height of centroid 106 from detector plane 101, by eyeballing. SOD 103 will be equal to the multiplication of SFD 105 and the ratio of the known actual diameter of the spherical ball marker to the diameter of the circular/elliptical projection of the spherical ball marker on detector plane 101. The diameter of the circular/elliptical projection of the spherical ball marker on detector plane 101 is equal to the diameter of the circular/elliptical projection of the spherical ball marker measured on the final X-ray image multiplied by the digital magnification ratio (given below).

A digital magnification ratio determinator for an X-ray image (ML or AP) may be included in example embodiments. The digital magnification ratio is the ratio of the value of the distance between the positions of the projections of any two points on the object's surface on detector plane 101 to the value of the distance between the corresponding points as measured in the final X-ray image, which may be measured in terms of pixels or mm. This ratio can be a known parameter, or an example method for determining the digital magnification ratio for an X-ray image may be used, wherein a circular coin marker with a known actual diameter is placed on the detector while taking the X-ray image. The digital magnification ratio will be approximately equal to the ratio of the known actual diameter of the circular coin to the diameter of the coin as visible on the final X-ray image, as measured in terms of number of pixels or mm. All the positions determined on the final X-ray image, in terms of X and Y coordinates (e.g., in pixels), may be multiplied with the digital magnification ratio before processing for 3D reconstruction. This includes contour points and landmarks.
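The calibration arithmetic described above can be illustrated with a short sketch (not part of the patent disclosure; the function names and all numeric values are hypothetical): the digital magnification ratio is taken from a coin marker, the SOD is approximated from the spherical ball marker, and the perspective ratio K is the quotient SOD/SFD.

def digital_magnification(coin_diameter_mm, coin_diameter_in_image):
    """Ratio of the coin's actual diameter to its diameter measured on the
    final X-ray image (in pixels or mm)."""
    return coin_diameter_mm / coin_diameter_in_image

def approximate_sod(sfd_mm, ball_diameter_mm, ball_diameter_in_image, magnification):
    """SOD ~ SFD * (actual ball diameter / projected ball diameter), where the
    projected diameter on the detector is the diameter measured on the final
    image multiplied by the digital magnification ratio."""
    projected_diameter_mm = ball_diameter_in_image * magnification
    return sfd_mm * (ball_diameter_mm / projected_diameter_mm)

mag = digital_magnification(coin_diameter_mm=24.0, coin_diameter_in_image=120.0)
sod = approximate_sod(sfd_mm=1000.0, ball_diameter_mm=25.0,
                      ball_diameter_in_image=135.0, magnification=mag)
k = sod / 1000.0  # perspective ratio K = SOD / SFD
print(mag, sod, k)

Contour points and landmarks measured on the final image would likewise be scaled by the digital magnification ratio before 3D reconstruction, as noted above.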
As shown in FIG. 1, example embodiment system 1 may include a contourer 16 that defines contours of a bone or other object in an uploaded or imported X-ray. The contour of a bone is a curve consisting of a set of 2D points on the final X-ray image which corresponds to the outer boundary of the bone that is visible on the final X-ray image. Contourer 16 may allow a user to draw an outer boundary of the bone anatomy of interest. Typically, a user draws the outer boundary of the bone anatomy of interest, depending on the surgery. For example, a femur and tibia bone for knee replacement or tibial osteotomy surgery may be outlined. Automated pre-defined contouring may be used to pre-empt contouring lines and assist the user in relatively more precise contouring. Brightness and/or contrast of the X-ray image may be adjusted so that the boundary of the bone anatomy is easily distinguishable.

Contourer 16 may provide an initial contour for each bone that can be the boundary of the projection of the template according to the calculated camera model. Since the vertices of the template will be divided and labelled as distinct regions, the projected initial contour will also have the
