An Arizona security company is working on an interesting approach to mobile authentication, one that leverages the exact angle a user holds the phone as a means of making replay attacks a lot more difficult. Aetna has been testing the method internally, according to the security company’s CEO, and the company, Trusona, has announced about $18 million in funding: $10 million from Microsoft Ventures and $8 million from Kleiner Perkins Caufield & Byers.
The Microsoft Ventures funding is interesting because one of the more popular mobile authentication methods today is Microsoft’s Authenticator app. Is Redmond covering its bases, or does it see the Trusona effort as threatening to displace Authenticator, at least in the enterprise IT world?
Trusona’s basic approach is simple enough: The user opens the app on a mobile device — where the more robust authentication kicks in — and uses it to scan a QR code on a desktop device. The app also lets users authenticate themselves by scanning a driver’s license or passport, with the app recording the exact distance and angle of the phone and its camera, again to block replay attacks.
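Trusona hasn’t published the details of its protocol, but the desktop-login flow described above follows a familiar pattern: the desktop session displays a one-time token as a QR code, and the already-authenticated phone claims it. A minimal sketch, with invented names and no relation to Trusona’s actual implementation:

```python
import secrets
import time

PENDING = {}      # token -> {"created": timestamp, "user": None or user_id}
TOKEN_TTL = 120   # seconds a displayed QR code stays valid (illustrative)

def start_desktop_login():
    """Desktop asks the server for a one-time token to render as a QR code."""
    token = secrets.token_urlsafe(24)
    PENDING[token] = {"created": time.time(), "user": None}
    return token

def claim_from_phone(token, user_id):
    """Phone app scans the QR code and binds its authenticated user to it."""
    entry = PENDING.get(token)
    if entry is None or time.time() - entry["created"] > TOKEN_TTL:
        return False              # unknown or expired token
    if entry["user"] is not None:
        return False              # already claimed: a second claim smells like replay
    entry["user"] = user_id
    return True

def poll_desktop(token):
    """Desktop polls until its token has been claimed by a phone."""
    entry = PENDING.get(token)
    return entry["user"] if entry else None
```

The one-time, short-lived token is what keeps a photographed or replayed QR code from being useful later; Trusona layers the sensor data described below on top of a flow like this.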
Unlike Authenticator and the much-maligned texting of confirmation codes (which is highly susceptible to man-in-the-middle attacks), this approach does away with the user having to type in any digits or characters. Not only does this no-typing approach reduce the burden on the user and avoid the risk of typos, but, argues Trusona CEO Ori Eisen, “if you’re not typing, a keylogger can’t listen in” and capture the keystrokes.
Mobile devices are amazing things and few companies have even begun to leverage a tiny portion of all that they can do — and that desktop systems can’t. From a mobile security perspective, leveraging as many of those tracking elements as possible is where there is so much authentication potential. In reviewing one of the patent applications Trusona has filed — looks like it has filed quite a few — it’s clear that Trusona has determined that there’s gold in them thar sensors.
Among the mobile authentication factors it is considering are “operational parameters of the camera or an imaging sensor of the imaging device at the time an image is captured or a barcode is scanned,” the application said. “For instance, a state of the camera may include the exposure time, ISO speed rating, focal length, use of flash, light balancing, resolution, or any other information that may be varied with time, environment conditions such as lighting, orientation or position of the camera. The user device may have one or more sensors on-board the device to provide instantaneous positional and attitude information of the imaging device. In some embodiments, the positional and attitude information may be provided by sensors such as a location sensor (e.g., Global Positioning System (GPS)), inertial sensors (e.g., accelerometers, gyroscopes, inertial measurement units (IMUs)), altitude sensors, attitude sensors (e.g., compasses), pressure sensors (e.g., barometers), and/or field sensors (e.g., magnetometers, electromagnetic sensors) and the like.”
The application options continued: “location sensors (e.g., global positioning system (GPS) sensors, mobile device transmitters enabling location triangulation), vision sensors (e.g., imaging devices capable of detecting visible, infrared, or ultraviolet light, such as cameras), proximity sensors (e.g., ultrasonic sensors, lidar, time-of-flight cameras), inertial sensors (e.g., accelerometers, gyroscopes, inertial measurement units (IMUs)), altitude sensors, pressure sensors (e.g., barometers), audio sensors (e.g., microphones), time sensors (e.g., clocks), temperature sensors, sensors capable of detecting memory usage and/or processor usage, or field sensors (e.g., magnetometers, electromagnetic sensors). Sensors of different types may measure different types of signals or information (e.g., position, orientation, velocity, acceleration, proximity, pressure, etc.) and/or utilize different types of measurement techniques to obtain data. For instance, the sensors may include any suitable combination of active sensors (e.g., sensors that generate and measure energy from their own source) and passive sensors (e.g., sensors that detect available energy).”
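The patent language is deliberately exhaustive, but the underlying idea reduces to something simple: at the moment of each scan, bundle the camera state and sensor readings into a snapshot that binds the capture to one physical instant. A hypothetical illustration (every field name here is invented, not Trusona’s API):

```python
import time

def capture_snapshot(camera, sensors):
    """Bundle camera state and device-sensor readings taken at scan time.

    camera and sensors are plain dicts of current readings; a real app
    would pull these from the platform's camera and sensor APIs.
    """
    return {
        "timestamp": time.time(),
        "iso": camera["iso"],                      # ISO speed rating
        "exposure_ms": camera["exposure_ms"],      # exposure time
        "focal_length": camera["focal_length"],
        "gps": sensors["gps"],                     # (latitude, longitude)
        "accel": sensors["accel"],                 # (ax, ay, az) in m/s^2
        "gyro": sensors["gyro"],                   # (wx, wy, wz) in rad/s
        "pressure_hpa": sensors["pressure_hpa"],   # barometer reading
    }
```

A replayed session would have to reproduce every one of these readings at once, which is the crux of the approach.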
The application also specifically explored how the end user is holding the phone. Make that very specifically: “Positional information may include a latitude, longitude, and/or altitude of the imaging device. The positional information may include an orientation of the imaging device. For instance, the positional information may include an orientation of the device with respect to one, two, or three axes (e.g., a yaw axis, pitch axis, and/or roll axis). The positional information may be determined relative to an inertial reference frame (e.g., environment, Earth, gravity), and/or a local reference frame. The positional information may include movement information of the imaging device. For instance, the positional information may include linear speed of the device or linear acceleration of the device relative to one, two, or three axes. The positional information may include angular velocity or angular acceleration of the device about one, two, or three axes. The positional information may be collected with aid of one or more inertial sensors, such as accelerometers, gyroscopes, and/or magnetometers.”
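The pitch-and-roll signal at the heart of “the exact angle a user holds the phone” is easy to illustrate from a single accelerometer reading. A minimal sketch, assuming a typical phone axis convention (x to the right, y toward the top of the screen, z out of the screen); the exact convention varies by platform and this is not taken from the patent:

```python
import math

def pitch_roll(ax, ay, az):
    """Estimate pitch and roll (degrees) from one accelerometer sample.

    ax, ay, az: gravity components in m/s^2 along the device axes.
    Valid only when the phone is roughly at rest, so the accelerometer
    is measuring gravity rather than user motion.
    """
    pitch = math.atan2(-ax, math.hypot(ay, az))  # tilt about the y axis
    roll = math.atan2(ay, az)                    # tilt about the x axis
    return math.degrees(pitch), math.degrees(roll)
```

A phone lying flat on a table (gravity entirely on z) yields pitch and roll of zero; any tilt moves gravity onto the other axes, and those angles become part of the session’s fingerprint.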
It then included my personal favorite: Activating the microphone and seeing what background sounds it picks up: “Environmental information collected by the user device at the time an image is captured. The environmental information may include audio information collected by a microphone of the device. The environmental information may include information collected by a motion detector, an ultrasonic sensor, lidar, temperature sensor, pressure sensor, or any other type of sensor that may collect environmental information about the device. The environmental information may include detecting the touch or hand position of a user holding the device, and collecting which portions of the device are touched or held by the user.”
That last part, about precisely where and how the user’s hand touches the device, is a wonderful touch — no pun intended. No user ever touches a screen in precisely the same way twice.
Eisen argues that the premise of many of today’s authentication mechanisms — where an exact match to the credential is sought — is ludicrous. Indeed, it plays into the hands of cyberthieves who sniff or otherwise steal an authenticated session. “A 100 percent match is useless,” said Eisen, whose background includes stints serving as the director of worldwide fraud for American Express, director of fraud for Verisign and VP/MIS for Bank of America.
“Your finger touches the screen ever so differently,” Eisen said. “You’re not clicking it perfectly on the same pixel. We can time it down to the millisecond and factor in the angle of the screen. You’re getting it down to a singularity.” In other words, add enough mobile sensor data to any authentication session and it becomes close to impossible to credibly replicate a session. An exact match, which would be quite difficult to engineer, would do little other than confirm that it’s a fraud attempt.
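Eisen’s logic inverts the usual matching rule: a fingerprint that is close but imperfect looks human, while a bit-for-bit identical one looks like a replayed recording. A sketch of that decision (not Trusona’s actual scoring, and a real system would use per-feature tolerances rather than one number):

```python
def judge_session(stored, observed, tolerance=0.15):
    """Compare a new session's sensor fingerprint against the stored one.

    stored and observed are dicts mapping feature names (e.g. pitch,
    roll, tap timing) to float readings, with identical keys.
    """
    if stored == observed:
        return "replay-suspected"   # "A 100 percent match is useless"
    diffs = [abs(stored[k] - observed[k]) for k in stored]
    if max(diffs) <= tolerance:
        return "accept"             # close, but humanly imperfect
    return "reject"                 # too far off to be the same user
```

The interesting case is the first branch: an attacker who captures and replays a session reproduces it too perfectly, and that perfection is itself the tell.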
Eisen claims more than 750 current customers, and his firm has a tiered pricing model that has shifted over the last few years. Today, Trusona offers its app for free, an option Eisen expects to appeal to small-to-medium-size businesses. For bigger businesses — such as the $61 billion Aetna, whose testing is being overseen by Chief Security Officer Jim Routh, Eisen said — Trusona offers an SDK so that companies can integrate the Trusona functionality into their own apps. That SDK service is priced at about $1 per user per year, Eisen said.