eButton


Despite its small size (62 mm in diameter), light weight (42 g), and casual appearance, eButton is a sophisticated wearable computer with a CPU and an array of sensors. Let's find out why eButton is an unusual device with a broad range of future applications.

eButton was developed under National Institutes of Health research grant U01 HL91736.

What is inside eButton?
eButton contains a powerful CPU running the Linux or Android operating system, along with the other essential components that qualify it as a computer. It is equipped with an array of sensors as peripherals, such as a triaxial gyroscope for measuring body orientation, a triaxial accelerometer for body acceleration, a user-controllable camera for video recording, a GPS sensor for geographical location, altitude, and speed, a UV sensor for indoor/outdoor detection, and a thermometer for ambient temperature. With the addition of skin-surface sensors, eButton can also measure physiological variables such as skin conductance, heart rate, and respiration rate. eButton currently stores data on a miniSD card, and we are installing additional chips to enable wireless connection to remote servers for cloud computing and real-time user interaction.
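To make the data concrete, here is a minimal sketch of how one multi-sensor eButton sample could be represented in software. The field names and units are illustrative assumptions, not the actual eButton firmware or file format.

    # Illustrative only: field names and units are assumed, not eButton's actual format.
    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class EButtonSample:
        timestamp_s: float                     # seconds since recording started
        gyro_dps: Tuple[float, float, float]   # triaxial gyroscope, degrees per second
        accel_g: Tuple[float, float, float]    # triaxial accelerometer, in g
        gps: Tuple[float, float, float]        # latitude, longitude, altitude (m)
        speed_mps: float                       # speed derived from GPS, m/s
        uv_index: float                        # UV sensor reading (indoor/outdoor cue)
        temperature_c: float                   # ambient temperature, degrees Celsius
        image_file: str                        # camera frame stored on the miniSD card

    sample = EButtonSample(12.5, (0.1, -0.3, 0.0), (0.01, 0.98, 0.05),
                           (40.444, -79.953, 300.0), 1.2, 0.0, 22.5, "img_000123.jpg")
    print(sample.temperature_c)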
How was eButton built?
eButton was built in our laboratory under a research grant from the NIH Genes and Environment Initiative (GEI), using advanced electronic design similar to that of a smartphone. However, our design had to meet the additional challenges of making the device small and wearable.
How does eButton differ from a smartphone?
You may have realized that eButton, a small, attractive, convenient, but powerful electronic chest button, is not an ordinary device. It could mark a new domain of wearable computing. Unlike a cell phone, which spends most of its time sleeping, eButton never sleeps; it helps the user all the time. From its chest location, eButton can look outwards to monitor diet, physical activity, and the living environment, and inwards to monitor physiological variables inside the body, e.g., heart and lung function. In contrast, health and wellness applications have never been priorities in the design of the smartphone.
What are the applications?
There are many. Here are some applications; you can imagine your own.
Has eButton been commercialized?
We have filed two patents to protect the intellectual property and are actively seeking investment to commercialize eButton. Interested investors, please contact Janice L. Panza, PhD, Technology Licensing Associate, Office of Technology Management, University of Pittsburgh, 412-648-2225, panzajl@pitt.edu.
Has eButton been validated?
In addition to various laboratory validation studies, we have so far conducted three focus group studies to obtain public opinions on eButton designs. We have also evaluated eButton on 18 human subjects, with over 1,000 hours of data recorded in real life, with respect to its functions of monitoring diet, physical activity, and lifestyle, as well as calculating energy intake and expenditure.
What are the public responses to eButton?
eButton has been highlighted in an article in the top scientific journal Nature and by the NIH Director. It has also been reported worldwide (e.g., in a BBC radio interview with Dr. Sun).

Application Example: Objective Diet and Physical Activity Assessment
Believe it or not, in sharp contrast to the abundant electronic gadgets that characterize our daily life, we do not have a single electronic device that can measure both diet and physical activity objectively, despite their paramount importance in health and wellness. The “ancient” method of self-reporting is still the “gold standard” for weight control. Since an average person does not intentionally memorize or record what he or she eats and does, self-reporting is well known to be inaccurate. Scientists and engineers are obligated to provide society with an objective tool for measuring both diet and physical activity in people’s daily lives, to help them control weight and keep it off. With the development of eButton, we can now both observe and evaluate diet and physical activity in individuals' daily lives, as shown in the following summary.


Below are typical recordings from eButton and illustrations of how eButton processes these data automatically.
Typical data

Data segmentation and abstraction

Decimation of image sequence
Result of an adaptive-threshold blur detection algorithm (a generic sketch of this idea appears below these captions)
Automatic screenshot detection
Video segmentation concepts
Multitouch event categorization

Face detection
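The adaptive-threshold blur detection mentioned above can be illustrated with a common, generic approach: score each frame by the variance of its Laplacian and flag frames whose sharpness falls below a threshold adapted to the whole sequence. This is a sketch of the general idea, not the specific algorithm used in the eButton software.

    # Generic blur-detection sketch; not the eButton algorithm itself.
    import cv2
    import numpy as np

    def sharpness(gray):
        # Variance of the Laplacian: low values suggest a blurry frame.
        return cv2.Laplacian(gray, cv2.CV_64F).var()

    def flag_blurry(image_paths, factor=0.5):
        """Mark frames whose sharpness falls below an adaptive threshold,
        here a fixed fraction of the median sharpness of the sequence."""
        scores = []
        for path in image_paths:
            gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
            scores.append(sharpness(gray))
        threshold = factor * np.median(scores)   # threshold adapts to the sequence
        return [s < threshold for s in scores]

    # Example: blurry = flag_blurry(["frame_0001.jpg", "frame_0002.jpg"])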
Diet Assessment
Food volume estimation
Object references with known camera intrinsic parameters
In this method, the subject places a reference object beside the food. Advantage: high accuracy; Disadvantage: inconvenient.
The estimation is based on the deformation of circular features. Advantage: high accuracy and convenience; Disadvantage: a circular feature is not always available.
Three or more laser beams are emitted from the electronic device, producing references that can be used to calculate food dimensions (a simplified triangulation sketch follows this list). Advantage: high accuracy; Disadvantage: difficult to implement in engineering terms.
An LED shines a spot of light on the food, and the portion size is estimated from the shape and position of the light spot. Advantage: easy to implement using advanced high-intensity LEDs; Disadvantage: difficult data processing and low accuracy.
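To illustrate the laser-reference idea, the sketch below assumes the simplest possible geometry: a single laser beam parallel to the camera's optical axis at a known baseline, so the distance to the laser spot follows from where the spot appears in the image. All parameter values are illustrative; this is not the actual eButton optics.

    # Back-of-the-envelope laser triangulation under an assumed parallel-beam geometry.
    def laser_depth(pixel_offset, focal_px, baseline_cm):
        """Distance (cm) to the surface hit by the laser spot.
        pixel_offset: distance in pixels of the spot from the principal point
        focal_px:     camera focal length expressed in pixels
        baseline_cm:  offset between the laser emitter and the camera center"""
        return focal_px * baseline_cm / pixel_offset

    def pixels_to_cm(pixels, depth_cm, focal_px):
        # With the camera-to-food distance known, convert pixel spans to physical size.
        return pixels * depth_cm / focal_px

    depth = laser_depth(pixel_offset=120.0, focal_px=800.0, baseline_cm=5.0)   # about 33 cm
    print(round(depth, 1), round(pixels_to_cm(240.0, depth, 800.0), 1))        # 33.3 10.0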
Dining plate as an object reference with known camera intrinsic parameters
Automatic dining plate detection as a circular reference (a simplified sketch of the scale-from-plate idea follows)
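A rough sketch of the circular-reference idea: detect the plate as a circle and use its known physical diameter to convert pixel measurements into centimeters. This only illustrates the concept; the published eButton methods additionally handle perspective (the plate generally appears as an ellipse) and the camera's intrinsic parameters.

    # Concept sketch: plate of known diameter as a scale reference (perspective ignored).
    import cv2

    def plate_scale_cm_per_px(image_path, plate_diameter_cm=27.0):
        gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
        gray = cv2.medianBlur(gray, 5)
        circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=200,
                                   param1=100, param2=60, minRadius=60, maxRadius=600)
        if circles is None:
            raise ValueError("no plate-like circle found")
        x, y, r = max(circles[0], key=lambda c: c[2])   # keep the largest detected circle
        return plate_diameter_cm / (2.0 * r)            # centimeters per pixel

    # Example: a food item spanning 180 pixels on a 27 cm plate
    # scale = plate_scale_cm_per_px("meal.jpg"); print(180 * scale)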


Portion size measurement from food pictures
Virtual reality-based measurement


Physical Activity Assessment
In sharp contrast to diet, where objective tools are rare, many objective tools are available for physical activity (PA) evaluation, such as indirect calorimetry, doubly labeled water (DLW), pedometers, heart rate monitors, and accelerometers. However, although diet and PA are both key factors in the etiology of obesity and other chronic diseases, a practical and objective tool for joint diet and PA evaluation in free-living individuals does not currently exist. Often, diet and PA are studied separately without considering their correlations. It has been pointed out that this approach segregates and fragments the problem domain, raising concerns about the usefulness of study results. In our approach, we first visually identify the real-life events (including sedentary events) from the acquired multi-sensor data and record their durations from the automatically saved time stamps. Next, we search the PA compendium (a database available here) to find the best match and record the corresponding metabolic equivalent (MET) value. Finally, the caloric expenditure is calculated using an empirical formula based on the MET value and the resting metabolic rate (RMR), which requires the subject's gender, age, weight, and height (see the sketch after the RMR equations below).

Mechanisms of measurement using eButton
Procedure for calculating the calorie expenditure of PA. The circle with a cross denotes multiplication.
Assuming that the PA compendium and the demographic information are both accurate, PA identification accuracy fully determines calorie expenditure accuracy.

Resting metabolic rate (RMR) based on the Mifflin equation (kcal/day):
For men: RMR = 10 × weight (kg) + 6.25 × height (cm) − 5 × age (years) + 5
For women: RMR = 10 × weight (kg) + 6.25 × height (cm) − 5 × age (years) − 161
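The calculation described above can be sketched in a few lines, assuming one common convention: energy (kcal) = MET × per-minute RMR × duration in minutes. The exact empirical formula used in the eButton studies may differ, and the MET values below are illustrative stand-ins for the PA compendium.

    # Sketch of MET-based energy expenditure; formula convention and MET values are assumptions.
    def rmr_mifflin(sex, weight_kg, height_cm, age_yr):
        """Resting metabolic rate (kcal/day) from the Mifflin equation."""
        base = 10.0 * weight_kg + 6.25 * height_cm - 5.0 * age_yr
        return base + 5.0 if sex == "male" else base - 161.0

    MET_COMPENDIUM = {"sitting": 1.3, "walking": 3.5, "housework": 3.3}   # illustrative values

    def activity_kcal(activity, minutes, sex, weight_kg, height_cm, age_yr):
        rmr_per_min = rmr_mifflin(sex, weight_kg, height_cm, age_yr) / 1440.0   # kcal per minute
        return MET_COMPENDIUM[activity] * rmr_per_min * minutes

    # A 35-year-old, 70 kg, 175 cm man walking for 30 minutes:
    print(round(activity_kcal("walking", 30, "male", 70, 175, 35), 1))   # about 118 kcal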

Physical activity compendium - MET values
Metabolic Equivalent (MET) values for activities in the American Time Use Survey (ATUS); see details

Automatic physical activity classification
Testing results for six categories of activities: sitting up (SU), sitting still (SS), walking (WK), bowing (BW), crouching (CR), and waist exercise (WE).
Horizontal axis: resolution of motion orientation.
Vertical axis: recognition rate using naive Bayes, k-nearest neighbors, and support vector machine classifiers (a toy comparison sketch follows).
Image features: our multiresolution good feature detection method (MRGF).
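As a toy illustration of this comparison, the sketch below trains the same three classifier types on placeholder feature vectors; the random features stand in for the MRGF motion features and the labels for the six activity categories.

    # Toy comparison of the three classifiers named in the caption; data are random placeholders.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC

    CLASSES = ["SU", "SS", "WK", "BW", "CR", "WE"]
    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 16))                # placeholder motion-feature vectors
    y = rng.integers(0, len(CLASSES), size=600)   # placeholder activity labels

    for name, clf in [("naive Bayes", GaussianNB()),
                      ("k-NN", KNeighborsClassifier(n_neighbors=5)),
                      ("SVM", SVC(kernel="rbf"))]:
        rate = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name}: recognition rate {rate:.2f}")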

Multitouch technology for PA categorization
Publications
  1. Sun M., Burke L. E., Baranowski T., Fernstrom J. D., Zhang H., Chen H. C., Bai Y., Li Y., Li C., Yue Y., Li Z., Nie J., Sclabassi R. J., Mao Z. H., and Jia W., "An exploratory study on a chest-worn computer for evaluation of diet, physical activity and lifestyle," J Healthc Eng, vol. 6, pp. 1-22, 2015
  2. Li Z., Wei Z., Yue Y., Wang H., Jia W., Burke L. E., Baranowski T., and Sun M., "An adaptive Hidden Markov model for activity recognition based on a wearable multi-sensor device," J Med Syst, vol. 39, p. 57, 2015
  3. Li Z., Wang K., Jia W., Chen H. C., Zuo W., Meng D., and Sun M., "Multiview stereo and silhouette fusion via minimizing generalized reprojection error," Image Vis Comput, vol. 33, pp. 1-14, 2015
  4. Li Y., Jia W., Yu T., Mao Z.-h., Zhang H., and Sun M., "A low power, parallel wearable multi-sensor system for human activity evaluation," in Proc. of 41st Annual Northeast Biomedical Engineering Conference, Troy, NY, April 17-19, 2015
  5. Chen H.-C., Jia W., Sun X., Li Z., Li Y., Fernstrom J. D., Burke L. E., Baranowski T., and Sun M., "Saliency-aware food image segmentation for personal dietary assessment using a wearable computer," Meas. Sci. Technol., vol. 26, p. 025702, 2015
  6. Yu H., Zhu J., Wang Y., Jia W., Sun M., and Tang Y., "Obstacle Classification and 3D Measurement in Unstructured Environments Based on ToF Cameras," Sensors (Basel), vol. 14, pp. 10753-10782, 2014
  7. Sun W., Han L., Guo B., Jia W., and Sun M., "A fast color image enhancement algorithm based on max intensity channel," Journal of Modern Optics, vol. 61, pp. 466-477, 2014
  8. Bai Y., "A Wearable Indoor Navigation System for the Blind and Visually Impaired Individuals," PhD Thesis, Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA, 2014
  9. Sun M., Burke L. E., Mao Z. H., Chen Y., Chen H. C., Bai Y., Li Y., Li C., and Jia W., "eButton: a wearable computer for health monitoring and personal assistance," in Proc. of 51st Annual Design Automation Conference, San Francisco, CA, June 01 - 05, 2014, pp. 1-6.
  10. Jia W., Chen H. C., Yue Y., Li Z., Fernstrom J., Bai Y., Li C., and Sun M., "Accuracy of food portion size estimation from digital pictures acquired by a chest-worn camera," Public Health Nutr, vol. 17, pp. 1671-1681, 2014
  11. Li J., Shi W., Deng D., Jia W., and Sun M., "Dense Stereo Matching Method Based on Local Affine Model," J Comput (Taipei), vol. 8, pp. 1696-1703, 2013
  12. Chen H. C., Jia W., Yue Y., Li Z., Sun Y. N., Fernstrom J. D., and Sun M., "Model-based measurement of food portion size for image-based dietary assessment using 3D/2D registration," Meas Sci Technol, vol. 24, 2013
  13. Jia W, Yue Y, Fernstrom JD, Yao N, Sclabassi RJ, Fernstrom MH, Sun M. "Image-based estimation of food volume using circular referents in dietary assessment". Journal of Food Engineering. 2012;109(1):76-86.
  14. Y. Bai, C. Li, Jia W, Li J, Mao Z-H, Sun M. "Designing a wearable computer for lifestyle evaluation". in Proc 38th Annual Northeast Bioengineering Conference; 2012 March 16-18; Philadelphia, PA.
  15. H.-C. Chen, W. Jia, Z. Li, Y.-N. Sun, Sun M. "3D/2D Model-to-Image Registration for Quantitative Dietary Assessment". in Proc 38th Annual Northeast Bioengineering Conference; 2012 March 16-18; Philadelphia.
  16. J. Li, M. Sun, H.-C. Chen, Li Z, Jia W. "Anthropometric Measurements from Multi-View Images". in Proc 38th Annual Northeast Bioengineering Conference; 2012 March 16-18; Philadelphia.
  17. Zhang H, Li L, Jia W, Fernstrom JD, Sclabassi RJ, Mao ZH, Sun M. "Physical activity recognition based on motion in images acquired by a wearable camera". Neurocomputing. 2011;74(12-13):2184-2192. PMCID: 3138674.
  18. Li L, Zhang H, Jia W, Mao Z-H, You Y, Sun M. "Indirect activity recognition using a target-mounted camera". in Proc 4th International Congress on Image and Signal Processing (CISP); 2011 October 15-16; Shanghai, China. pp. 487-491.
  19. Li Z, Wei Z, Sclabassi R, Jia W, Sun M. "Blur detection in image sequences recorded by a wearable camera". in Proc IEEE 37th Annual Northeast Biomedical Engineering Conference; 2011 April 1-3; Troy, NY.
  20. Sun M, Sclabassi RJ, Fernstrom JD, Fernstrom MH, Jia W, Method, apparatus and system for food intake and physical activity assessment (Amendment), U.S. Patent filed by the University of Pittsburgh. 2011.
  21. Li C, Fernstrom JD, Sclabassi RJ, Fernstrom MH, Jia W, Mao Z-H, Sun M. "Food density estimation using Fuzzy Logic Inference". in Proc IEEE 36th Northeast Biomedical Engineering Conference; 2010 March 26-28; New York, NY.
  22. Li L, Zhang H, Jia W, Fernstrom JD, Sclabassi RJ, Sun M. "Recognizing physical activity of a person wearing a video camera". in Proc 32nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society; 2010 Aug 31-Sept 4.
  23. Nie J, Wei Z, Jia W, Li L, Fernstrom JD, Sclabassi RJ, Sun M. "Automatic detection of dining plates for image-based dietary evaluation". in Conf Proc IEEE Eng Med Biol Soc; 2010 August 31 - September 4; Buenos Aires, Argentina.pp.4312-4315.
  24. Sun M, Fernstrom JD, Jia W, Hackworth SA, Yao N, Li Y, Li C, Fernstrom MH, Sclabassi RJ. "A wearable electronic system for objective dietary assessment". J Am Diet Assoc. 2010;110(1):45-47. PMCID: 2813220.
  25. Yao N. "Food dimension estimation from a single image using structured lights". Ph.D. Thesis: Department of Electrical and Computer Engineering, University of Pittsburgh; Advisor:Sun M. 2010.
  26. Yue Y. "Dimensional measurement of objects in single images independent from restrictive camera parameters". Master Thesis: Department of Electrical and Computer Engineering, University of Pittsburgh; Advisor:Sun M. 2010.
  27. Yue Y, Jia W, Fernstrom JD, Sclabassi RJ, Fernstrom MH, Yao N, Sun M. "Food volume estimation using a circular reference in image-based dietary studies". in Proc IEEE 36th Northeast Biomedical Engineering Conference; 2010 March 26-28; New York, NY.
  28. Zhang H, Li L, Jia W, Fernstrom JD, Sclabassi RJ, Sun M. "Recognizing physical activity from ego-motion of a camera". in Conf Proc IEEE Eng Med Biol Soc; 2010 August 31- September 4; Buenos Aires, Argentina.pp.5569-5572.
  29. Zhang H, Li Y, Hackworth SA, Yue Y, Li C, Yan G, Sun M. "The design and realization of a wearable embedded device for dietary and physical activity monitoring". in Proc 3rd International Symposium on Systems and Control in Aeronautics and Astronautics; 2010 June 8-10; Harbin, China.pp.123-126.
  30. Zhang W, Jia W, M. Sun. "Segmentation for efficient browsing of chronical video recorded by a wearable device". in Proc IEEE 36th Northeast Biomedical Engineering Conference; 2010 March 26-28; New York, NY.
  31. Zhang Z. "Food volume estimation from a single image using virtual reality technology". Master Thesis: Department of Electrical and Computer Engineering, University of Pittsburgh; Advisor:Sun M. 2010.
  32. Jia W, Zhao R, Yao N, Fernstrom JD, Fernstrom MH, Sclabassi RJ, M. Sun. "A food portion size measurement system for image-based dietary assessment". in Proc IEEE 35th Northeast Biomedical Engineering Conference; 2009 April 3-5; Cambridge, MA.
  33. Sun M, Fernstrom JD, Jia W, Yao N, Hackworth SA, Liu X, Li C, Liu Q, Li Y, Fernstrom MH, Sclabassi RJ. Assessment of food intake and physical activity: a computational approach. In: C.H.Chen, editor. Handbook of Pattern Recognition and Computer Vision. Hackensack, NJ: World Scientific Publishing; 2009. pp. 667-686.
  34. Sun M, Sclabassi RJ, Fernstrom JD, Fernstrom MH, Method, apparatus and system for food intake and physical activity assessment, application publication No.: 2009/0012433, U.S. Patent filed by the University of Pittsburgh. 2009.
  35. Sun M, Yao N, Hackworth SA, Yang J, Fernstrom JD, Fernstrom MH, Sclabassi RJ. "A human-centric smart system assisting people in healthy diet and active living". in Proc Int Symp Digital Life Technologies: Human-Centric Smart Living Technology; 2009 May 28-30; Tainan, Taiwan.
  36. Chen M, Dhingra K, Wu W, Yang L, Sukthankar R, Yang J. "PFID: Pittsburgh fast-food image database". in Proc IEEE ICIP; 2009 Nov 7-10; Cairo, Egypt.
  37. Wu W, Yang J. "Fast food recognition from videos of eating for calorie estimation". in Proc IEEE ICME; 2009 June 28-July 3; New York, NY.
  38. Fei J, Yang J, Fan J. "Towards virtually cooking chinese food". in Proceedings of IEEE International Conference on Multimedia & Expo (ICME); 2009 June 28-July 3; New York,NY.
  39. Wang Q, Yang J. "Drinking activity analysis from fast food eating video using generative models". in ACM Int Conf on Multimedia (SIGMM) - Workshop on Cooking and Eating Activity; 2009.
  40. Sun M, Liu Q, Schmidt K, Yang J, Yao N, Fernstrom JD, Fernstrom MH, DeLany JP, Sclabassi RJ. "Determination of food portion size by image processing". in Conf Proc IEEE Eng Med Biol Soc; 2008; Vancouver, BC.pp.871-874.
  41. Yao N, Sclabassi RJ, Liu Q, Fernstrom JD, Fernstrom MH, Yang J, Sun M. "A sparse representation of physical activity video in the study of obesity". in Proc IEEE International Symposium on Circuit and Systems; 2008 May 18-21; Seattle, WA.pp.2582-2585.
  42. Yao N, Sclabassi RJ, Zhao R, Zhang H, Sun M. "A laser based depth measurement method for digital imaging of close-up objects". in Proc 16th International Conference on Mechanics in Medicine and Biology; 2008 July 22-25; Pittsburgh, PA.
  43. Yao N, Zhao R, Zhang H, Yang J, Fernstrom JD, Fernstrom MH, Sclabassi RJ, Sun M. "A simple laser rangefinder for food dimension measurement". in Proc IEEE 34th Northeast Biomedical Engineering Conference; 2008 April 4-6; Providence, RI.
  44. Zhang H, Zhang K, Mu Y, Yao N, Sclabassi RJ, Sun M. "Weight measurement using image-based pose analysis". Progress in Natural Science. 2008;18(12):1507-1512.
  45. Zhang K, Zhang H, Yao N, Sclabassi RJ, Sun M. "Improved carrying load measurement using video-based gait analysis". in Proc 16th International Conference on Mechanics in Medicine and Biology; 2008 July 22-25; Pittsburgh, PA.
  46. Yang L, Zheng N, Fernstrom JD, Sun M, Yang J. "Automatic dietary assessment from fast food categorization". in Proc IEEE 34th Northeast Biomedical Engineering Conference; 2008 April 4-6; Providence, RI.
  47. Zhang H, Zhang K, Yao N, Sclabassi RJ, Sun M. "Load measurement based on gait analysis". in Proc IEEE 34th Northeast Biomedical Engineering Conference; 2008 April 4-6; Providence, RI.
  48. Greiner S, Yang J. "Privacy protection in an electronic chronicle system". in Proc IEEE 34th Northeast Biomedical Engineering Conference; 2008 April 4-6; Providence, RI.
  49. Yang L, Greiner S, Zheng N, Cheng H, Fernstrom JD, Sclabassi RJ, Sun M, Yang J. "Interactive dietary assessment from video". in Proc 16th International Conference on Mechanics in Medicine and Biology; 2008 July 22-25; Pittsburgh, PA.
  50. Zhang H, Liu Q, Yao N, Sun M. "Carried load measurement based on gait analysis and human kinetics". in Proc IEEE 2008 International Congress on Image and Signal Processing (CISP); 2008 May 27-30; Sanya, China.pp.104-107.
  51. Chen D, Liu Q, Sun M, Yang J. "Mining appearance models directly from compressed video". IEEE Transactions on Multimedia. 2008;10(2):268-276.
  52. Yao N, Sclabassi RJ, Liu Q, Fernstrom JD, Fernstrom MH, Sun M. "A video processing approach to the study of over-weight and obesity". in Proc IEEE Int Conf on Multimedia and Expo; 2007 July 2-5; Beijing, China.
  53. Yao N, Sclabassi RJ, Liu Q, Sun M. "A video-based algorithm for food intake estimation in the study of obesity". in Proc of the IEEE 33rd Annual Northeast Biomedical Conference; 2007 March 10-11; Stony Brook, NY.