Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/16199
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 陳炳宇(Bing-Yu Chen) | |
dc.contributor.author | Shu-Yang Lin | en |
dc.contributor.author | 林書漾 | zh_TW |
dc.date.accessioned | 2021-06-07T18:04:45Z | - |
dc.date.copyright | 2012-07-31 | |
dc.date.issued | 2012 | |
dc.date.submitted | 2012-07-27 | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/16199 | - |
dc.description.abstract | 中文摘要
這篇論文提出 SWAP (a Smart Watch for Assisting smart Phones),一個為了輔助智慧型手機而設計的手錶裝置。SWAP 提供不需要視覺回饋的互動方式,來輔助使用者與其智慧型手機之間的互動。我們用 SWAP 來展現手機與手錶的合作機制概念,提出智慧型手錶的功能不宜與智慧型手機的功能完全重複,而應該扮演輔助智慧型手機的角色。然而,並非所有智慧型手機上的功能都需要智慧型手錶的幫助。因此,我們進行了一連串的調查與訪談,以回答下列問題:在現行智慧型手機上,什麼樣的互動與功能是適合移植到手錶上來操作的?根據調查與訪談的結果,我們歸納出四個指導方針,來描述這些適合的互動與功能:互動時間短、注意力需求低、精確度需求低,以及即時互動性。根據這些描述 SWAP 適合的互動與功能的指導方針,我們設計了一個不需要視覺回饋的互動系統 PUB (Point Upon Body),來讓使用者與 SWAP 互動。PUB 讓使用者藉由觸摸自己的手臂肌膚,並得到觸覺回饋,來操作 SWAP。我們接著進行了兩項使用者測試,以探究使用者與自身手臂互動的能力,並觀察使用者在自己手臂空間中的互動行為。使用者研究的結果顯示,一般使用者最多可以在自身手臂上,手腕至手肘之間,分辨六個點。我們觀察使用者行為的結果也指出下列設計方針:1. 使用者認知的觸碰位置實際上跟真正的觸碰位置並不相同。2. 每位使用者的觸碰位置都有各自獨特的分布。3. 使用者的觸碰位置不應該在偵測到後即刻被決定。4. 每位使用者都各自有最多觸碰個數的能力限制。5. 離散分布的觸碰點適合操作於階層式的使用者介面。基於以上從使用者測試觀察到的設計方針,我們實作了 PUB 互動系統。PUB 互動系統藉由裝附在使用者手腕上的超音波感測器,來偵測使用者對於自身手臂的觸碰位置。最後,我們利用 PUB 互動系統展示了兩個應用,分別是遠端螢幕操作與行動裝置操作。 | zh_TW |
dc.description.abstract | Abstract
In this thesis, we present SWAP (a Smart Watch for Assisting smart Phones), a wristwatch designed to assist smartphone users by providing eyes-free interaction. SWAP embodies the concept of a cooperation mechanism between watches and phones: we propose that a smart watch should act as an assisting device for the smartphone rather than a substitute for it. However, not all functions on smartphones need assistance from a smart watch. Therefore, we conducted a series of field studies to answer this question: which interactions and functions on current smartphones are suitable to be migrated to a smart watch? We then derived four guidelines that describe the suitable interactions: short interaction time, low attention demand, low precision, and immediacy. Based on these guidelines, which suggest the content and interactions suitable for SWAP, we designed an eyes-free interaction system, PUB (Point Upon Body), that lets users operate SWAP through the haptic feedback of touching their own arm skin. We conducted two user studies to investigate users' ability to interact with their forearms and how they behave when operating in their own arm space. The results show that typical users can distinguish at most six points between the wrist and the elbow with iterative practice. The experiments also indicate the following design implications: 1. The mentally perceived tap position often differs from the actually measured one. 2. Each user has a unique tapping pattern. 3. A tapped position should not be confirmed immediately upon detection. 4. Each individual has a personal limit on the number of distinguishable points. 5. A hierarchical UI can benefit from the discrete points. Finally, based on these design implications, we implemented the PUB interaction system, which uses an ultrasonic sensing device attached to the user's wrist to detect tapped positions, and we demonstrate two applications: remote display control and mobile device control. | en |
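As an illustration of the interaction idea described in the abstract, the following is a minimal Python sketch of mapping a sensed tap distance along the forearm to one of a few per-user-calibrated discrete points. It assumes an ultrasonic wrist sensor that reports the wrist-to-finger distance in centimeters; the `calibrate` and `classify_tap` helpers and the `reject_margin` threshold are hypothetical, not the PUB implementation itself.

```python
# Hypothetical sketch of PUB-style tap classification (not the thesis implementation).
# Assumes an ultrasonic sensor on the wrist reports the distance (in cm) from the
# wrist to the tapping finger, and that each user has calibrated a small set of
# discrete points between wrist and elbow (the studies above suggest at most six).

from statistics import mean

def calibrate(samples_per_point):
    """samples_per_point: one list of distance readings per discrete point.
    Returns the per-user center (mean distance) of each point."""
    return [mean(samples) for samples in samples_per_point]

def classify_tap(distance, centers, reject_margin=1.5):
    """Map a raw distance reading to the index of the nearest calibrated point.
    Returns None if the reading is too far from every center (ambiguous tap),
    reflecting the implication that a tap should not be confirmed immediately."""
    best = min(range(len(centers)), key=lambda i: abs(distance - centers[i]))
    if abs(distance - centers[best]) > reject_margin:
        return None
    return best

if __name__ == "__main__":
    # Example: a user calibrated four points between wrist and elbow.
    centers = calibrate([[4.8, 5.1], [9.9, 10.2], [15.0, 14.7], [20.1, 19.8]])
    print(classify_tap(10.4, centers))  # -> 1 (second point from the wrist)
    print(classify_tap(12.6, centers))  # -> None (ambiguous, between points)
```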
dc.description.provenance | Made available in DSpace on 2021-06-07T18:04:45Z (GMT). No. of bitstreams: 1 ntu-101-R99944015-1.pdf: 24259089 bytes, checksum: 757e8da2d55abaa43c3f28e2b9fd615e (MD5) Previous issue date: 2012 | en |
dc.description.tableofcontents | Contents
致謝 i 摘要 iii Abstract v Chapter 1 Introduction 1 1.1 Motivation 2 1.1.1 Smartphone Uses 2 1.1.2 Smartwatch Position 2 1.2 SWAP 3 1.2.1 Content 4 1.2.2 Interaction System 4 1.2.3 Implementation and Application 5 1.3 Organization 6 Chapter 2 Related Work 7 2.1 Watch Researches 7 2.2 Cooperating between Watches and Phones 7 2.3 Interaction with Watch Devices 8 2.4 Body-Sensing Input 9 Chapter 3 Future Watch — SWAP 11 3.1 Field Study 12 3.1.1 Diary Logging 12 3.1.2 Interview 15 3.1.3 Questionnaire Survey 17 3.2 User Study 18 3.2.1 Participants 19 3.2.2 Apparatus 19 3.2.3 Task and Procedure 20 3.2.4 Results 21 Chapter 4 Interaction System — PUB 24 4.1 Participants and Experiment Conditions 24 4.2 User Study 1: Explore the Division on the Forearm 25 4.2.1 Tasks and Procedures 26 4.2.2 Hypotheses 26 4.2.3 Data Processing 27 4.2.4 Result 28 4.3 User Study 2: Importance of the Feedback from Skin 31 4.3.1 Tasks and Procedures 31 4.3.2 Hypothesis 32 4.3.3 Result 32 4.4 Tapping Behaviors 32 4.5 Design Implications 34 Chapter 5 Implementation and Application 41 5.1 Implementation 41 5.1.1 Visualization 41 5.1.2 Content 41 5.1.3 Interaction System — PUB Implementation 42 5.1.4 Performance 45 5.2 Applications 45 5.2.1 Mobile Eyes-Free Interaction 45 5.2.2 Remote Display Interaction 46 5.3 Improvement 46 5.3.1 Gesture Recognition 47 Chapter 6 Conclusion and Future Work 51 6.1 Conclusion 51 6.2 Future Work 52 Bibliography 53 | |
dc.language.iso | en | |
dc.title | To Watch or Not to Watch?: 為輔助智慧型手機設計之可利用無視覺方式互動之手錶 | zh_TW |
dc.title | To Watch or Not to Watch?: A Smartphone-Assisting Watch with Eyes-Free Interactions | en |
dc.type | Thesis | |
dc.date.schoolyear | 100-2 | |
dc.description.degree | 碩士 | |
dc.contributor.oralexamcommittee | 陳彥仰(Mike Y. Chen),余能豪(Neng-Hao Yu) | |
dc.subject.keyword | 無視覺回饋,手錶,手機,輔助 | zh_TW |
dc.subject.keyword | eyes-free, watch, phone, assist | en |
dc.relation.page | 57 | |
dc.rights.note | 未授權 | |
dc.date.accepted | 2012-07-27 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
Appears in Collections: | 資訊工程學系
Files in this item:
File | Size | Format |
---|---|---|
ntu-101-1.pdf (currently not authorized for public access) | 23.69 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.