EFFICIENT TRAINING FOR MACHINE LEARNING MODELS ON EMBEDDED AND/OR LOW-POWER DEVICES


Technical Disclosure Commons, Defensive Publications Series
June 05, 2017

Author: Tyler Freeman

Recommended Citation: Freeman, Tyler, "EFFICIENT TRAINING FOR MACHINE LEARNING MODELS ON EMBEDDED AND/OR LOW-POWER DEVICES", Technical Disclosure Commons (June 05, 2017). http://www.tdcommons.org/dpubs_series

This work is licensed under a Creative Commons Attribution 4.0 License.

ABSTRACT

A machine learning system is described that enables an embedded and/or low-power device to locally train machine learning models using data collected by sensors of the device. When the device is being used by a user and operating on battery power (e.g., during the day), one or more sensors of the device may collect data. When the device is not being used by the user and is charging (e.g., at night), the device may use the collected data to train one or more machine learning models used by the device. In this way, the device may locally train machine learning models without degrading the user experience.

BACKGROUND

Training machine learning models can be a very computationally intensive task, one that a low-power or embedded device, such as a smartwatch or smartphone, cannot handle without degrading the user experience and/or battery life. As such, training of machine learning models used by low-power or embedded devices is typically offloaded to an external server (i.e., as opposed to training the model on the low-power or embedded device). However, training the model on a server requires transferring all the data collected on the local device to the external server for processing. For low-power devices with limited connectivity, such as smartwatches or smartphones in areas without a reliable or inexpensive internet connection, sending large amounts of data can be prohibitive (either cost prohibitive or technically infeasible). Thus, machine learning on embedded and low-power devices remains a difficult problem.

DESCRIPTION

Techniques are described that enable a low-power and/or embedded device to perform local training of machine learning models. The device may include one or more sensors that collect data throughout the day. The collected data may be stored in a memory or other storage device of the device. At the end of the day, a user of the device may connect the device to a charger to replenish a battery of the device. While the device is charging, one or more processors of the device may utilize the collected data to train one or more machine learning models used by the device. As such, by the time the user again desires to use the device, the battery may be fully charged and the machine learning models may be updated (i.e., trained).
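The following is a minimal sketch of this collect-by-day, train-by-night cycle. The Device class and its is_charging(), read_sensors(), and train_models() methods are hypothetical names introduced only for illustration; the disclosure does not prescribe any particular API.

```python
import time
from typing import Any, Dict, List

class Device:
    """Hypothetical stand-in for the low-power/embedded device."""
    def __init__(self) -> None:
        self.buffer: List[Dict[str, Any]] = []  # samples gathered on battery

    def is_charging(self) -> bool:
        raise NotImplementedError  # query the platform power-management API

    def read_sensors(self) -> Dict[str, Any]:
        raise NotImplementedError  # e.g., accelerometer, microphone, light

    def train_models(self, samples: List[Dict[str, Any]]) -> None:
        raise NotImplementedError  # update the on-device models

def run(device: Device) -> None:
    while True:
        if device.is_charging():
            # Plugged in (typically overnight): spend the idle, externally
            # powered time training, then clear the buffer so the same
            # samples are not trained on twice.
            device.train_models(device.buffer)
            device.buffer.clear()
        else:
            # On battery and in use: only collect and store; defer all
            # computationally intensive training.
            device.buffer.append(device.read_sensors())
        time.sleep(60)  # polling shown for simplicity; a charger-connect
                        # interrupt would avoid even this small cost
```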

Consider system 1 (referred to simply as "system 1"), shown below in FIG. 1, which includes an example low-power and/or embedded device (referred to simply as "device 2") configured to perform local machine learning model training in accordance with the techniques described herein. System 1 includes device 2 and charger 3.

[FIG. 1: system 1, showing device 2 (with processors 4, sensors 5, storage devices 6, power sources 7, and model training module 8) connected to charger 3 via link 9]

While illustrated as a wearable computing device, device 2 may be any type of low-power and/or embedded device. Examples of device 2 include, but are not limited to, smartwatches, visors (e.g., head-mounted computing devices), fitness trackers (e.g., wrist-worn, ankle-worn, waist-worn, or other devices worn by users that track movement or other athletic activity), mobile computing devices (e.g., smartphones, tablets, and the like), or any other device that uses machine learning models.

As shown in FIG. 1, device 2 may include processor(s) 4, sensor(s) 5, storage device(s) 6, power source(s) 7, and model training module 8.

Processors 4 may implement functionality and/or execute instructions within device 2. Examples of processors 4 include, but are not limited to, digital signal processors (DSPs), general-purpose microprocessors, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.

Sensors 5 may generate data representative of various measurands. Examples of sensors 5 include, but are not limited to, microphones, cameras, accelerometers, gyroscopes, thermometers, heart rate sensors, pressure sensors, barometers, ambient light sensors, altimeters, and the like.

Storage devices 6 may store information for processing during operation of device 2. As one example, storage devices 6 may store data generated by sensors 5. As another example, storage devices 6 may store machine learning models used by device 2. As yet another example, storage devices 6 may store an operating system, applications, or other programs for execution at device 2. Examples of storage devices 6 include, but are not limited to, volatile memories (e.g., random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), and other forms of volatile memory known in the art) and non-volatile memories (e.g., magnetic hard discs, optical discs, floppy discs, solid-state memories such as flash memory, forms of electrically programmable memory (EPROM) or electrically erasable and programmable memory (EEPROM), and other forms of non-volatile memory known in the art).
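As an illustration of the kind of record storage devices 6 might hold between collection and training, here is a minimal sketch. The field names (sensor, values, timestamp, context) and the JSON-lines log format are assumptions; the disclosure says only that sensor data is stored along with timestamps and/or contextual information.

```python
import json
import time
from dataclasses import asdict, dataclass, field
from typing import Any, Dict

@dataclass
class SensorRecord:
    sensor: str                   # e.g., "accelerometer"
    values: Dict[str, float]      # the measurand(s) for this sample
    timestamp: float = field(default_factory=time.time)
    context: Dict[str, Any] = field(default_factory=dict)  # e.g., screen state

def append_record(path: str, record: SensorRecord) -> None:
    """Append one record to a flat on-flash log (one JSON object per line)."""
    with open(path, "a") as log:
        log.write(json.dumps(asdict(record)) + "\n")

# Example: log a motion sample along with whether the display was on.
append_record("sensor_log.jsonl",
              SensorRecord(sensor="accelerometer",
                           values={"x": 0.01, "y": -0.98, "z": 0.12},
                           context={"display_on": False}))
```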

Power sources 7 may provide power to one or more components of device 2. For instance, power sources 7 may include a battery that provides power to processors 4, sensors 5, and storage devices 6. Power sources 7 may be rechargeable via charger 3. For instance, to recharge power sources 7, device 2 may be connected to charger 3 via link 9. Link 9 may be a wired link, such as a charging cable, or a wireless link, such as an inductive coupling. Power sources 7 may be sized such that device 2 may be operated all day without requiring recharging. As such, device 2 may typically be connected to charger 3 at night, when device 2 is not likely to be used (e.g., as the user of device 2 is likely sleeping).

As opposed to utilizing external devices (e.g., servers) to train machine learning models, device 2 includes model training module 8, which may be executable by processors 4 to train one or more machine learning models used by device 2. Model training module 8 may train the models based on data generated by sensors (e.g., sensors 5), user actions, and context. For instance, model training module 8 may be executable by processors 4 to train a machine learning model used to determine whether or not to activate a display of device 2, using motion data generated by sensors 5 and user actions that indicate whether or not the user viewed or otherwise interacted with the display when it was activated (a toy sketch of such training appears below).

In operation, while device 2 is not connected to charger 3, model training module 8 may store data for later use when training machine learning models. For instance, model training module 8 may be executable by processors 4 to collect data from sensors 5, user actions, and context (e.g., whatever data points are needed to train the model). Model training module 8 may be executable by processors 4 to cause storage devices 6 to store the collected data (i.e., data from sensors 5, user actions, and context) along with timestamps and/or contextual information. Model training module 8 may defer training of the model(s) based on the collected data until the device is charging. For instance, model training module 8 may monitor a status of device 2 (e.g., actively monitor, receive an interrupt, or the like) to determine when device 2 has been connected to charger 3.
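As a toy version of the display-activation example above: a plain-Python logistic regression that learns, from motion features, whether the display should be activated, with labels derived from whether the user actually interacted with the display after it lit up. The feature names and the choice of logistic regression are assumptions made to keep the sketch dependency-free; the disclosure does not specify a model type.

```python
import math
import random
from typing import List, Tuple

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train_wake_model(samples: List[Tuple[List[float], int]],
                     epochs: int = 50, lr: float = 0.1) -> List[float]:
    """samples: (motion feature vector, 1 if the user interacted, else 0)."""
    n = len(samples[0][0])
    w = [0.0] * (n + 1)  # n feature weights plus a bias term at w[-1]
    for _ in range(epochs):
        random.shuffle(samples)
        for x, y in samples:
            # Predicted probability that the display should activate.
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + w[-1])
            g = p - y  # gradient of the log-loss w.r.t. the logit
            for i in range(n):
                w[i] -= lr * g * x[i]
            w[-1] -= lr * g
    return w

# Hypothetical features: [peak acceleration, wrist rotation]; label is 1
# when the user viewed/interacted with the display after it activated.
data = [([0.9, 1.2], 1), ([0.1, 0.0], 0), ([0.8, 1.0], 1), ([0.2, 0.1], 0)]
weights = train_wake_model(data)
```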

In some examples, model training module 8 may immediately begin training the model(s) in response to determining that device 2 has been connected to charger 3. In other examples, model training module 8 may evaluate one or more other parameters to determine when to initiate training of the models. As one example, model training module 8 may wait a period of time (e.g., 1 minute, 5 minutes, 30 minutes, 1 hour, 2 hours, etc.) after determining that device 2 has been connected to charger 3 before initiating training of the models. As another example, after determining that device 2 has been connected to charger 3, model training module 8 may wait until no user input is being received and/or no user input has been received for a period of time (e.g., 1 minute, 5 minutes, 30 minutes, 1 hour, 2 hours, etc.) before initiating training of the models. As another example, model training module 8 may use a machine learning model to predict when device 2 is likely to be charged (e.g., connected to charger 3) for periods of time long enough to perform model training. In this way, if the user merely charges device 2 in the middle of the day for a quick refresh (e.g., 10-15 minutes), model training module 8 may avoid having to halt a model training operation when device 2 is disconnected from charger 3. A sketch combining these heuristics appears at the end of this section.

By deferring model training to when device 2 is not being used, these techniques enable device 2 to locally train machine learning models without degrading the user experience. For instance, as training machine learning models may consume more power than normal operation, training the models while device 2 is charging avoids premature depletion of power sources 7. Additionally, as training machine learning models consumes cycles of processors 4 and/or other system resources, deferring model training to when device 2 is not being used allows those cycles and resources to be used for normal operation and/or user interactions. Similarly, as the model training is performed when the user is not interacting with device 2, device 2 may utilize higher amounts of processing power and/or other system resources than would be available if model training were not deferred. Furthermore, by enabling embedded and/or low-power devices to locally train machine learning models, these techniques bring the power and benefits of machine learning to smaller, underpowered devices, and to devices owned by users who do not have reliable or inexpensive internet connections for training models on a remote server.
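The following is a minimal sketch combining the three training-initiation heuristics above: a settle period after charger connect, a user-idle requirement, and a predicted-duration check. The specific thresholds and the predicted_session_seconds input (which would come from the charge-duration prediction model) are illustrative assumptions.

```python
import time
from typing import Optional

SETTLE_SECONDS = 30 * 60       # wait after charger connect (here, 30 minutes)
IDLE_SECONDS = 5 * 60          # require no user input for (here) 5 minutes
MIN_SESSION_SECONDS = 60 * 60  # require a predicted ~1 hour on the charger

def should_start_training(charger_connected_at: float,
                          last_user_input_at: float,
                          predicted_session_seconds: float,
                          now: Optional[float] = None) -> bool:
    """Gate the start of model training on the heuristics described above."""
    now = time.time() if now is None else now
    settled = now - charger_connected_at >= SETTLE_SECONDS
    idle = now - last_user_input_at >= IDLE_SECONDS
    long_enough = predicted_session_seconds >= MIN_SESSION_SECONDS
    return settled and idle and long_enough
```

Under this gating, a quick 10-15 minute midday top-up fails the predicted-duration check, so training never starts and never has to be halted mid-run when the device is unplugged.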