A Study of Head Movement Compensation for a Wearable Eye Tracker
dc.contributor | 黃奇武 | zh_TW |
dc.contributor | Chi-Wu Huang | en_US |
dc.contributor.author | 曾士誠 | zh_TW |
dc.contributor.author | Shih-Chen Tseng | en_US |
dc.date.accessioned | 2019-09-03T10:47:03Z | |
dc.date.available | 2020-03-02 | |
dc.date.available | 2019-09-03T10:47:03Z | |
dc.date.issued | 2015 | |
dc.description.abstract | This study proposes a rotation-based compensation for the head-movement deviation in 2-D mapping, restoring the estimated point of gaze (POG) to its post-calibration accuracy and freeing the user from having to keep the head fixed on a chin rest. According to the literature, both 2-D polynomial mapping and 3-D eye-modeling estimation methods mostly rely on an infrared light source and the corneal reflection (glint) features it produces: 2-D mapping uses polynomial functions to compute the estimated POG, while 3-D eye modeling reconstructs the visual axis of the eye, whose intersection with the screen gives the POG. The literature also notes that before POG estimation, 2-D mapping needs only a simple calibration in which the user fixates on predefined known points and the collected data are used to compute the polynomial coefficients; a 3-D eye model instead requires an expensive stereo camera and the associated camera-system calibration parameters, or solving for the eye-model vectors, and some camera-system setups add an auxiliary wide-angle stereo camera together with a 3-D digitizer for calibration, making the procedure much more complicated for the user than the calibration steps of 2-D mapping. This study uses two PS3 cameras and free open-source software to build a wearable eye tracker costing less than USD 100, with which users can accurately perform eye-controlled typing on an on-screen keyboard; communication aids for paralyzed people are the most widespread application. Compared with expensive commercial eye trackers (costing more than USD 10,000), the accuracy of this tracker satisfies the needs of experiments and applications, and its advantage in hardware cost is obvious. Our team has used the homemade wearable eye tracker, based on 2-D mapping, in psychology-experiment applications such as gaze hot-zones, regions of interest, and scan-paths, as well as for on-screen keyboard typing, and we hope in future work to study 3-D-modeling-based POG estimation for effective use in real-world environments. | zh_TW |
dc.description.abstract | This paper proposes an approach in which a 3-D rotation matrix compensates the errors that head movements introduce into 2-D mapping, which maps the glint-pupil difference vector obtained from the eye image onto a screen to estimate the point of gaze (POG). With this compensation the estimated POG stays within a predefined accuracy even when the head moves away from the original calibration position, freeing the tracker user from having to keep the head uncomfortably confined in a chin rest during eye tracking. A review of recent eye-tracking techniques shows that both 2-D polynomial mapping and 3-D modeling track the glint, a bright reflection of the light source on the eye surface, together with the rapidly moving pupil to find the POG. 2-D mapping applies selected polynomial functions to compute the POG on the screen, whereas 3-D modeling measures and computes the pupil center and glint positions in 3-D so that the visual axis of the eye can be reconstructed; the POG is then the intersection of the visual axis with the screen or with any other object in the real world. Before tracking starts, 2-D mapping needs only a simple calibration in which several predefined points on the screen are used to estimate the coefficients of the chosen polynomial functions, while calibration for 3-D models is more complicated and depends on the system configuration, such as mono-camera or stereo-vision measurements; it is also expensive, because some models require an additional auxiliary wide-angle stereo camera and a 3-D digitizer for system calibration. The proposed approach uses two PS3 cameras, one for the eye and one for the scene, together with open-source software, to construct a low-cost (under $100) wearable eye tracker capable of eye-controlled typing with satisfactory accuracy. Eye-controlled typing is an important human-computer interface (HCI) application, especially for disabled people, while commercial wearable eye trackers currently cost at least $10,000. The homemade eye tracker in our laboratory is based mainly on 2-D tracking with self-developed application software, such as scan-path trace, hot-zone display, interest-region search, and eye-controlled typing. In addition to modifying 2-D mapping with the rotation matrix, 3-D-based tracking is planned for future development, in the hope that it can work in real-world tracking environments rather than on the screen only, for wider applications. | en_US |
dc.description.sponsorship | Department of Electrical Engineering | zh_TW |
dc.identifier | GN060175025H | |
dc.identifier.uri | http://etds.lib.ntnu.edu.tw/cgi-bin/gs32/gsweb.cgi?o=dstdcdr&s=id=%22GN060175025H%22.&%22.id.& | |
dc.identifier.uri | http://rportal.lib.ntnu.edu.tw:80/handle/20.500.12235/95747 | |
dc.language | Chinese | |
dc.subject | Wearable eye tracker | zh_TW |
dc.subject | Corneal glint | zh_TW |
dc.subject | Optic axis | zh_TW |
dc.subject | Visual axis | zh_TW |
dc.subject | 2-D Mapping | zh_TW |
dc.subject | 3-D Modeling | zh_TW |
dc.subject | Wearable eye tracker | en_US |
dc.subject | Glint | en_US |
dc.subject | Optic axis | en_US |
dc.subject | Visual axis | en_US |
dc.subject | 2-D mapping | en_US |
dc.subject | 3-D modeling | en_US |
dc.title | A Study of Head Movement Compensation for a Wearable Eye Tracker | zh_TW |
dc.title | An Approach to Head Movement Compensation for a Wearable Eye Tracker | en_US |
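The abstracts above describe 2-D mapping as a two-step process: a short calibration in which the user fixates several known screen points to fit polynomial coefficients, followed by tracking in which each glint-pupil difference vector is mapped to a POG on the screen. The following is a minimal sketch of that idea, assuming a second-order polynomial and a NumPy least-squares fit; the function names and the exact polynomial terms are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def fit_2d_mapping(vectors, screen_points):
    """Least-squares fit of second-order polynomial mapping coefficients.

    vectors       : (N, 2) glint-pupil difference vectors from calibration frames
    screen_points : (N, 2) known screen coordinates fixated during calibration
    Returns one coefficient vector per screen axis.
    """
    dx, dy = vectors[:, 0], vectors[:, 1]
    # Design matrix with the polynomial terms [1, dx, dy, dx*dy, dx^2, dy^2]
    A = np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])
    ax, *_ = np.linalg.lstsq(A, screen_points[:, 0], rcond=None)
    ay, *_ = np.linalg.lstsq(A, screen_points[:, 1], rcond=None)
    return ax, ay

def map_to_screen(vector, ax, ay):
    """Apply the fitted polynomial to a glint-pupil vector to estimate the POG."""
    dx, dy = vector
    terms = np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])
    return float(terms @ ax), float(terms @ ay)
```

In this sketch the calibration data would come from frames captured while the user fixates the predefined points, and `map_to_screen` would then be called once per tracked frame.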
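The head-movement compensation proposed in the abstracts applies a 3-D rotation to undo the head's departure from the calibration pose before the 2-D mapping is evaluated. The sketch below shows one way such a correction could look, assuming a yaw-pitch-roll parameterization of the head rotation and that the 2-D glint-pupil vector is lifted into 3-D; these choices, and the function names, are assumptions for illustration rather than the thesis's actual formulation.

```python
import numpy as np

def head_rotation_matrix(yaw, pitch, roll):
    """3-D rotation matrix (Z-Y-X composition) for head yaw/pitch/roll in radians."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def compensate_vector(vector, R):
    """Rotate an observed glint-pupil vector back toward the calibration pose.

    The 2-D vector is lifted to 3-D (z = 0), rotated by the inverse of the head
    rotation R, and projected back to 2-D before the usual polynomial mapping.
    """
    v3 = np.array([vector[0], vector[1], 0.0])
    v3_comp = R.T @ v3  # the inverse of a rotation matrix is its transpose
    return v3_comp[:2]
```

A compensated POG estimate would then be obtained by passing `compensate_vector(vector, R)` to the fitted 2-D mapping instead of the raw vector, which is the sense in which the rotation keeps the mapping's accuracy within a predefined bound despite head movement.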