Tuesday, July 28, 2015

NAPPS has left the building (but is still on the front lawn)

A good standards effort defines specifications that build on the existing stack of underlying protocols, cryptographic techniques, data formats, and platform capabilities. A better standards effort defines specifications that can adapt as that existing stack changes and evolves. The very best standards efforts know when to declare victory, pack their bags, and go home - when that stack evolves in such a way as to remove the need for the standard in the first place.

By this measure, NAPPS, the OIDF WG chartered to define mechanisms in support of an SSO experience for native applications, is an awesome standards effort.

As John Bradley and I have previously pointed out, the mobile OSs are evolving their support for native SSO: both iOS and Android are adding new features that make SSO possible 'out of the box', without the introduction of specialized application software on the device, as the NAPPS group had been proposing. Consequently, the value of the 'Token Agent' model that NAPPS was proposing and standardizing is diminished - fundamentally, we don't need to supplement the mobile OSs to achieve native SSO when they provide sufficient capabilities on their own.

Consequently, as John writes, the NAPPS WG is 'pivoting' and, rather than delivering a normative specification for the Token Agent role, will instead:

"...document best practices for Single Sign-on for Enterprise and Software as a Service Providers using these new features in combination with the PKCE specification, as well as filling in any remaining gaps to allow SaaS providers to fully support OAuth and OpenID Connect enabled native applications in a secure way without forcing users into extra unproductive logins."

In addition to these sorts of guidelines, there is discussion about developing open source SDKs that would wrap up all these features and flows - simplifying how application developers hook into this native SSO model. Discussions are underway as to where development of these libraries makes sense.
Interestingly, while the value of a Token Agent has been marginalized by the new mobile OS features for the native SSO use case, the TA model may yet find a home in the Internet of Things.

Many IoT devices are characterized by limited UI capabilities for display and user input - both of which are critical for the initial binding of the device to a user account and corresponding provisioning of credentials. But if Things are constrained in this way, mobile devices aren't - and so can facilitate this initial setup step.

Shown here is a scenario where a native application on a device plays the role of a Token Agent on behalf of a Thing. The TA obtains an OAuth access token for the Thing and then delivers that token using some short range wireless protocol such as BLE or NFC. Once the Thing has its token, it can use that to authenticate itself when interacting with cloud endpoints or even other Things.
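A minimal sketch of that provisioning step might look like the following. The token endpoint, the scope naming, and the BLE/NFC write call are all hypothetical placeholders, not anything the NAPPS work specifies; a real deployment would also protect the token in transit.

```python
import requests  # assumed HTTP client

# Hypothetical endpoint -- a real deployment would use its own AS
TOKEN_ENDPOINT = "https://as.example.com/oauth2/token"

def obtain_token_for_thing(ta_refresh_token: str, thing_id: str) -> str:
    """Token Agent exchanges its own refresh token for an access token scoped to the Thing."""
    resp = requests.post(TOKEN_ENDPOINT, data={
        "grant_type": "refresh_token",
        "refresh_token": ta_refresh_token,
        "scope": f"thing:{thing_id}",   # hypothetical scope naming
    })
    resp.raise_for_status()
    return resp.json()["access_token"]

def provision_thing(ble_channel, ta_refresh_token: str, thing_id: str) -> None:
    """Deliver the freshly minted token to the Thing over a short-range channel (BLE or NFC)."""
    access_token = obtain_token_for_thing(ta_refresh_token, thing_id)
    ble_channel.write(access_token.encode("utf-8"))  # ble_channel stands in for whatever BLE/NFC library is used
```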


Should the TA model be eventually applied to IoT use cases, perhaps my not insignificant $$ investment in a large supply of 'There is nothing token about my agent' t-shirts will not be wasted. Let us hope.

Thursday, March 26, 2015

NAPPS - a rainbow of flavours

Below is an arguably unnecessarily vibrant swimlane diagram of the proposed NAPPS (Native Applications) flow for an enterprise-built native application calling an on-prem API.

The very bottom arrow of the flow (that from Ent_App to Ent_RS) is the actual API call that, if successful, will return the business data to the native app. That call is what we are trying to enable (with all the rainbow-hued exchanges above).

As per normal OAuth, the native application authenticates to the RS/API by including an access token (AT). Also shown is the possibility of the native application demonstrating proof of possession for that token, but I'll not touch on that here other than to say that the corresponding spec work is underway.

What differs in a NAPPS flow is how the native application obtains that access token. Rather than the app itself taking the user through an authentication & authorization flow (typically via the system browser), the app gets its access token via the efforts of an on-device 'Token Agent' (TA). 

Rather than requesting an access token from a network Authorization Server (as in OAuth or Connect), the app logically makes its request of the TA - labelled below as 'code Request + PKCE'. Upon receiving such a request from an app, the TA will endeavour to obtain from the Ent_AS an access token for the native app. This step is shown in green below. The TA uses a token it had previously obtained from the AS in order to obtain a new token for the app.

In fact, what the TA obtains is not the access token itself, but an identity token (as defined by Connect) that can be exchanged by the app for the more fundamental access token - as shown in pink below. While this may seem like an unnecessary step, it actually

  1. mirrors how normal OAuth works, in which the native app obtains an authz code and then exchanges that for the access token (this having some desirable security characteristics)
  2. allows the same pattern to be used for a SaaS app, ie one where there is another AS in the mix and we need a means to federate identities across the policy domains (a rough sketch of this exchange follows below). 
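Roughly, the app-side exchange might look like the sketch below. I'm using the standard JWT bearer grant (RFC 7523) purely as an illustrative stand-in for whatever the NAPPS profile ultimately specifies; the endpoint and parameter names are hypothetical.

```python
import requests

ENT_AS_TOKEN_ENDPOINT = "https://as.enterprise.example.com/token"  # hypothetical

def exchange_id_token_for_access_token(id_token_jwt: str, client_id: str) -> dict:
    """Native app trades the identity token (obtained via the Token Agent) for the
    access token it will present to the RS -- analogous to exchanging an
    authorization code for a token in plain OAuth."""
    resp = requests.post(ENT_AS_TOKEN_ENDPOINT, data={
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",  # illustrative stand-in
        "assertion": id_token_jwt,
        "client_id": client_id,
    })
    resp.raise_for_status()
    return resp.json()  # expected to contain access_token (and possibly a refresh_token)
```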




When I previously wrote 'TA uses a token it had previously obtained from the AS', I was referring to the flow coloured in light blue above. This is a pretty generic OAuth flow; the only novelty is the introduction of the PKCE mechanism to protect against a malicious app stealing tokens by sitting on the app's custom URL scheme.
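For reference, the PKCE piece itself is small - a minimal sketch of generating the verifier/challenge pair per RFC 7636 (S256 method):

```python
import base64
import hashlib
import os

def make_pkce_pair():
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636).
    The app sends the challenge on the front-channel request and proves
    possession of the verifier when redeeming the code."""
    verifier = base64.urlsafe_b64encode(os.urandom(32)).rstrip(b"=").decode("ascii")
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return verifier, challenge

code_verifier, code_challenge = make_pkce_pair()
```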


Friday, November 07, 2014

Application unbundling & Native SSO

Where you used to have a single application on your phone from a given social provider, you likely now have multiple.

Where there was Google Drive, there are now Sheets, Docs, and Slides - each individual application optimized for a particular document format.

Where the chat function used to be a tab within the larger Facebook application, there is now Facebook Messenger - a dedicated chat app.

LinkedIn has 4 individual applications.

The dynamic is not unique to social applications.



According to this article:
Mobile app unbundling occurs when a feature or concept that was previously a small piece of a larger app is spun off on it’s own with the intention of creating a better product experience for both the original app and the new stand-alone app.
The unbundling trend seems mostly driven by the constraints of mobile devices - multiple functions tucked behind tabs may work in a desktop browser, but on a small screen they end up hidden, accessible only through scrolling or tapping.

That was the stated justification for Facebook's unbundling of Messenger
We wanted to do this because we believe that this is a better experience. Messaging is becoming increasingly important. On mobile, each app can only focus on doing one thing well, we think. The primary purpose of the Facebook app is News Feed. Messaging was this behavior people were doing more and more. 10 billion messages are sent per day, but in order to get to it you had to wait for the app to load and go to a separate tab. We saw that the top messaging apps people were using were their own app. These apps that are fast and just focused on messaging. You're probably messaging people 15 times per day. Having to go into an app and take a bunch of steps to get to messaging is a lot of friction.
Of course, unbundling clearly isn't for everybody ....



I can't help but think about unbundling from an identity angle. Do the math - if you break a single application up into multiple applications, then what was a single authentication & authorization step becomes multiple such steps. And, barring some sort of integration between the unbundled applications (where one application could leverage a 'session' established for another), this would mean the user having to explicitly log in to each and every one of those applications.

The premise of 'one application could leverage a session established for another' is exactly what the Native Applications (NAPPS) WG in the OpenID Foundation is enabling in a standardized manner. NAPPS is defining both 1) an extension and profile of OpenID Connect by which one native application (or the mobile OS) can request a security token for some other native application, and 2) mechanisms by which the individual native applications can request and return such tokens.

Consequently, NAPPS can mitigate (at least one of) the negative implications of unbundling.

The logical end-state of the trend towards making applications 'smaller' would appear to be applications that are fully invisible, ie those that the user doesn't typically launch by clicking on an icon, but rather receives interactive notifications & prompts only when relevant (as determined by the application's algorithm). What might the implications of such invisible applications be for identity UX?







Wednesday, November 05, 2014

Sticky Fingers

Digits is a new phone-number based login system from Twitter.
Digits is a simple, safe way of using your phone number to sign in to your favorite apps.
Note that Digits is not just using your phone to sign in (there are a number of existing mobile-based systems that do that), but your phone number. 

Digits is an SMS-based login system (unlike mobile OTP systems like Google Authenticator). When trying to log in to some service, the user supplies their phone number, at which they soon receive an SMS carrying a one-time code to be entered into the login screen. After Twitter's service validates the code, the application can be (somewhat) confident that the user is the authorized owner of that phone number.
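Mechanically, the pattern is straightforward. Here is a bare-bones sketch of the server side (the SMS delivery function and the in-memory store are placeholders, not anything Digits documents; a real service would add rate limiting and attempt counters):

```python
import hmac
import secrets
import time

# In-memory store of pending codes, keyed by phone number: {phone: (code, issued_at)}
PENDING = {}
CODE_TTL_SECONDS = 300

def send_login_code(phone_number: str, send_sms) -> None:
    """Generate a one-time code and deliver it via SMS (send_sms is a placeholder)."""
    code = f"{secrets.randbelow(10**6):06d}"          # 6-digit numeric code
    PENDING[phone_number] = (code, time.time())
    send_sms(phone_number, f"Your login code is {code}")

def verify_login_code(phone_number: str, submitted: str) -> bool:
    """Check the submitted code; success implies possession of the phone, nothing more."""
    entry = PENDING.pop(phone_number, None)
    if entry is None:
        return False
    code, issued_at = entry
    if time.time() - issued_at > CODE_TTL_SECONDS:
        return False
    return hmac.compare_digest(code, submitted)       # constant-time comparison
```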

Now, the above makes it clear that Digits relies on only a single factor, ie a 'what you have' of the phone associated with the given phone number. This post even brags that you need not worry about any additional account names or passwords. But that same post claims that Digits is actually more than a single factor
Digits.com, an easy way for your users to manage their Digits accounts and enable two-factor authentication
As much as I squint, I can see no other factor in the mix. (And it sure isn't the phone number.)

Digits apparently also has privacy advantages.
Digits won't post on your behalf, so what you say and where you say it is completely up to you
Well, to be precise, Digits can't post on your behalf ... And is it not somewhat ironic that Twitter touts as an advantage of Digits the fact that it is not hooked into your Twitter account??

Presumably this is presented in contrast to the existing 'Sign-in with Twitter' system, use of which can allow a user to authorize applications to post to Twitter on their behalf (as the system is based on OAuth 1.0).

But of course, 'Sign-in with Twitter' allows applications to post on behalf of users only because Twitter made the business decision to make this permission part of the default set of authorizations. Twitter could have chosen to make their consent more granular and tightened up the default.

Dick Hardt analyzed Digits and highlighted two fundamental issues with using phone numbers as identifiers:


  1. the privacy risk associated with a user presenting the same identifier to all applications (as it enables subsequent correlation amongst those applications without the user's consent). It's pretty trivial to spin up new email addresses (even disposable ones) to segment your online interactions and prevent correlation. Is that viable for phone numbers?
  2. that applications generally aren't satisfied with only knowing who a particular user is, but almost always want to know the what as well, ie their other identity attributes, social streams, etc. 

Dick, having made the second point, perversely then conjectures that it may not be an issue
as mobile apps replace desktop web sites, the profile data may not be as relevant as it was a decade ago
I can't imagine why the native vs browser model would impact something as fundamental as wanting to understand your customer.  

Twitter actually tries to position this limitation as a strength of Digits
Each developer is in control with Digits. It lets you build your own profiles and apps, giving you the security of knowing your users are SMS-verified. 
The motivation for Digits.com becomes a bit clearer when you read more
We built Digits after doing extensive research around the world about how people use their smartphones. What we found was that first-time Internet users in places like Jakarta, Mumbai and São Paulo were primarily using a phone number to identify themselves to their friends.
Twitter must have looked at their share in these markets and determined they needed a different way to mediate users' application interactions.

Source - http://stats.areppim.com/stats/stats_socmediaxtime_afr.htm











Tuesday, October 28, 2014

Less is more

I attended GigaOM's Structure Connect conference in San Francisco last week. The event was great, lots of interesting discussions & panels.

I was in a 'Securing the IoT' breakout session where one of the GigaOM analysts made the assertion (paraphrasing)
Developers need better training on security, they need to take more responsibility for securing their applications.
This actually runs completely counter to what I've been seeing as the overarching trend around application security, namely that developers need to take (or even be given) less responsibility for securing their applications - not more.

If their app has to handle $$, do developers directly track currency exchange rates? No, they find an API that does that and so remove themselves from a task secondary to the application itself. The currency API abstracts all the messiness away from the developer - they make a simple REST call and get back a tidy bit of JSON to parse & use.
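Something like this, say - the endpoint and response shape below are hypothetical, standing in for whichever rates API the developer actually picks:

```python
import requests

def usd_to_eur(amount: float) -> float:
    """Convert USD to EUR via a third-party rates API (hypothetical endpoint and response shape)."""
    resp = requests.get("https://api.example-rates.com/latest",
                        params={"base": "USD", "symbols": "EUR"})
    resp.raise_for_status()
    rate = resp.json()["rates"]["EUR"]   # tidy JSON back; no exchange-rate plumbing in the app
    return amount * rate
```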

From the developer's point of view, why would security be different? Do they want to deal with the specific details of supporting different authentication protocols, crypto, etc.? Or would they prefer to focus on adding features and functionality to their apps?

The trend towards lightening the security load for developers manifests in various ways

  • Social 'Login with X' SDKs - the large social providers make it as easy as they can for native application developers to hook into their identities. For instance, Facebook Login promises
The Facebook SDK for iOS provides various login experiences that your app can use to authenticate someone. This document includes all the information you need to know in order to implement Facebook login in your iOS app.
Google has the comparable Google+ Sign-In, the documentation for which asserts
Avoid the hassle of creating your own authentication system. Google+ Sign-In mitigates data-breach risks and reduces the burden and costs of identity security. The Google authentication window creates a trusted link between you, your users, and their Google account.


  • REST gateways - many enterprise REST APIs are fronted by a gateway that intercepts incoming calls from clients and applies processing before delivering the call on to the actual API. The API developer need not directly deal with the authentication tokens attached to the original call; the gateway insulates them from that burden by validating the token and passing the call (and the identity it carries) on to the API (a sketch of this follows the list below).
  • IDaaS - Identity as a Service is the trend of enterprises moving certain identity & authentication mechanisms out to the cloud (just as many other enterprise functions are being outsourced). Rather than directly dealing with user provisioning, federation, password vaulting, etc., the enterprise subscribes to the services of an IDaaS provider. The IDaaS provider takes on the full burden of dealing with multiple protocols, business partners, customers, SaaS, etc., and offers back to the enterprise developer a much simpler integration proposition.
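To make the gateway point concrete, here is a sketch of what the API developer's side might look like once a gateway has already validated the inbound token. The Flask framing and the header name are purely illustrative assumptions, not any particular gateway's convention:

```python
from flask import Flask, jsonify, request  # assuming a Flask-based API behind the gateway

app = Flask(__name__)

def fetch_orders_for(user_id: str) -> list:
    """Placeholder for the actual business logic."""
    return []

@app.route("/orders")
def list_orders():
    # The gateway has already validated the OAuth token on the original call and
    # forwards only the resolved identity; the header name here is illustrative.
    user_id = request.headers.get("X-Authenticated-Subject")
    if user_id is None:
        return jsonify(error="missing identity"), 401
    return jsonify(orders=fetch_orders_for(user_id))
```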
The above are all examples of freeing application developers from having to bear full responsibility for securing APIs & native applications. And last I checked, both will be relevant for the Internet of Things. Freed from the burden of security, IoT developers will be able to focus their attention where they should - namely creating new & interesting visual paradigms for my wearable step data.



Wednesday, October 08, 2014

Social Media 2 Factor authentication


Premise

A user can authenticate to a web application (or a federation server) by sending an update (tweet, Facebook update, etc) with a randomly generated hashtag previously delivered to the user in the login interface. 

The fundamental requirements are that 

  1. the user be able to demonstrate ownership of the social account previously connected to their account at the authentication server by including a challenge string in a tweet, update etc
  2. the authentication server be able to determine that a particular challenge string was added to a tweet, update etc associated with a particular social account 

User Experience


Step 1 :


User binds their social account to the authentication server


Alternatively, the ‘binding’ could consist solely of the user telling the authentication server their Twitter handle.

Step 2:


Later, User visits login page

User logs in with first factor, ie password, or SSO

Login UI displays randomly generated challenge string

Authentication server stores away challenge string against that user’s account

Alternatively, the challenge mechanism could be via Twitter, ie the authentication server sends the user a tweet, and the user's response would be an RT.

Step 3:


User sends tweet, including the challenge hashtag from Step 2


The response format & channel will depend on the nature of the challenge and how the user’s social media account were bound to the account at the authentication server.

Step 4:

After displaying the hashtag challenge to the user, the authentication server polls the user’s tweet stream (or equivalent) on some schedule for a tweet (or post) containing the challenge hashtag.

If such a tweet is found within some time period, the authentication page displays successful login.
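A rough sketch of the server side of Steps 2 and 4 follows. The Twitter search call is stubbed out as a placeholder; a real implementation would use the Twitter API, handle pagination, and respect its rate limits.

```python
import secrets
import time

def issue_challenge(user_id: str, store: dict) -> str:
    """Step 2: generate a random hashtag challenge and remember it against the user's account."""
    challenge = "#login" + secrets.token_hex(4)          # e.g. '#login3f9a1c2b'
    store[user_id] = (challenge, time.time())
    return challenge

def poll_for_response(user_handle: str, challenge: str, fetch_recent_tweets,
                      timeout: int = 120, interval: int = 5) -> bool:
    """Step 4: poll the user's public stream until a tweet containing the challenge
    hashtag appears, or the window expires. fetch_recent_tweets is a placeholder
    for a real Twitter API call returning recent tweet texts."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if any(challenge in tweet for tweet in fetch_recent_tweets(user_handle)):
            return True
        time.sleep(interval)
    return False
```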

Discussion


  1. The default would be for the user to manually type the challenge string into their tweet. Might it be possible for the authentication server to instead/also display a QR code, for the user to scan and so launch their mobile Twitter client with the tweet ready to send?
  2. Instead of a string, the challenge could consist of a link to a specific picture or some other media
  3. If the user has previously authorized other applications to be able to send tweets on their behalf, then those other applications would potentially be able to send a response tweet, but only if they were able to know the challenge. Consequently, the authentication model is likely only relevant as a 2nd factor, as having the user first authenticate with the other factor would prevent other applications from knowing the challenge string.
  4. If the authentication server were able to determine how many applications the user has granted the ability to tweet on their behalf, then conceivably it could factor that into its assessment of assurance
  5. There could be a viral component to the marketing of the authentication service, as the user’s followers would see the authentication tweets
  6. Is there a risk of violating Twitter ToS?

A symmetrical NAPPS model


The NAPPS WG in the OIDF is defining a framework for enabling SSO to native applications.

One challenge has been in supporting 3rd party native applications from large SaaS that already have an OAuth & token infrastructure (Salesforce as an example).

For this sort of SaaS, NAPPS has to allow the SaaS's existing OAuth AS to issue the token ultimately used by the app on the API calls.

The NAPPS spec is evolving to deal with such applications in almost exactly the same way as it does native applications that call on-prem APIs built by the enterprise.

Fundamentally, for both categories of native applications, the enterprise AS issues to the Token Agent an identity token JWT, which is handed to the application through the mobile OS bindings. The app exchanges this JWT for the desired access token to be used on API calls - the only difference is the AS at which the JWT is exchanged.

Local native apps
  1. app requests tokens of TA, includes generated nonce
  2. TA uses its RT to send request + nonce to AS
  3. AS returns PoP JWT
  4. TA hands over PoP JWT to app
  5. App exchanges JWT, shows PoP
  6. AS returns token(s) to app
3rd party native apps 
  1. app requests tokens of TA, includes generated nonce
  2. TA uses its RT to send request + nonce to AS1
  3. AS1 returns PoP JWT, targeted at AS2
  4. TA hands over PoP JWT to app
  5. App exchanges PoP JWT against AS2, shows PoP
  6. AS2 returns token(s) to app
Step 5 in the 3rd party sequence implies a federated trust model - the SaaS AS2 must be able to trust & validate the JWT issued by the enterprise AS1.
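Sketched in (very rough) code, the two sequences differ only in which AS the app performs the final exchange against. Everything below - endpoints, grant types, parameter names - is illustrative only, not the WG's normative syntax, and the PoP demonstration is simplified to passing the nonce back.

```python
import requests
import secrets

def ta_get_pop_jwt(as1_token_endpoint: str, ta_refresh_token: str,
                   app_client_id: str, nonce: str) -> str:
    """Steps 2-3: the Token Agent uses its refresh token to ask AS1 for a PoP JWT
    on behalf of the requesting app (parameter names are illustrative)."""
    resp = requests.post(as1_token_endpoint, data={
        "grant_type": "refresh_token",
        "refresh_token": ta_refresh_token,
        "requested_client_id": app_client_id,   # hypothetical parameter
        "nonce": nonce,
    })
    resp.raise_for_status()
    return resp.json()["id_token"]

def app_exchange_jwt(exchange_endpoint: str, pop_jwt: str, nonce: str) -> dict:
    """Steps 5-6: the app exchanges the PoP JWT for its access token. For a local
    app this endpoint belongs to the enterprise AS (AS1); for a 3rd party SaaS app
    it belongs to the SaaS AS (AS2), which must trust JWTs issued by AS1."""
    resp = requests.post(exchange_endpoint, data={
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",  # illustrative stand-in
        "assertion": pop_jwt,
        "nonce": nonce,   # simplified stand-in for the proof-of-possession step
    })
    resp.raise_for_status()
    return resp.json()

# Step 1: the app generates the nonce it will later prove possession of
nonce = secrets.token_urlsafe(16)
```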


The above model is attractive for the symmetry it provides between both application categories.

Tuesday, October 07, 2014

As long as X is true .....


When my Samsung Gear watch is within BLE range of my Samsung S5, I need not enter my screen unlock pattern in order to get into the phone. The S5 interprets the proximity of the Gear as a proxy for my own proximity, and so deduces that it is me handling the phone and not somebody else. 

This is an example of what appears to be an emerging model for authentication, which I’ll give the pretentious name of ‘conditional session persistence’ and characterize as

‘As long as X is true, no need to Y’

where ‘X’ is some condition - the continued state of which protects the user from having to perform ‘Y’, generally some sort of explicit login operation.

For my Gear & S5 use case, the X condition is ‘the Gear is within BLE range of the S5’ and the ‘Y’ is ‘demonstrate knowledge of secret unlock pattern to access phone’.

This authentication model is appearing elsewhere.

The Nymi wristband records the user’s ECG and sends it to a companion app on a paired device for it to be compared to the previously recorded ECG pattern. If the biometric comparison is successful, then the companion application responds back to the Nymi that it should unlock a previously registered crypto key and use that key to authenticate to resources and services. To ‘authenticate’ to the Nymi the user must touch a finger of the other hand to the top of the wristband - this creates an electrical loop that allows the ECG to be recorded. Once recorded and successfully compared, the ECG is not measured again, at least not until the wristband is removed from the user’s wrist. As long as the wristband stays on the user’s wrist the Nymi remains willing to assert the user’s identity by presenting the key (or presumably separate keys for different resources). Once removed from the wrist, then the user is required to re-authenticate once more via their ECG.

The Apple Watch is reported to use the same model.

On the back of the case, a ceramic cover with sapphire lenses protects a specially designed sensor that uses infrared and visible-light LEDs and photodiodes to detect your heart rate.

Via the 4 sensors on the back, the Watch will be able to determine when it is removed from the wrist after an initial authentication (by PIN it seems, but it’s not inconceivable that it uses the heart rate as a biometric?). As long as the Watch stays on the user’s wrist, the original authentication remains valid and the Watch can be used to, for instance, buy over-priced coffees from hipster baristas.

What is novel in this new model is of course the ‘As long as X is true’ clause - some sort of continuous check of the user’s context that serves to better bind them to the original authentication.  

Contrast this new model with traditional web-based authentication, in which, after the user presents some password (inevitably derived from their favourite sports team's name), the authentication server sets some session cookie:

‘As long as T seconds haven’t expired, no need to Y’

In this model, nothing binds the user to the authenticated browser session or prevents somebody else from hijacking that session (which of course is why those who (perversely) log in to their banks and other sensitive resources from public kiosks are reminded to sign out when done).
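The two models boil down to two different validity predicates. A toy sketch of the contrast, where the sensor check is just a placeholder:

```python
import time

def session_valid_time_based(issued_at: float, max_age_seconds: float = 1800) -> bool:
    """Traditional model: 'as long as T seconds haven't expired, no need to re-authenticate'."""
    return time.time() - issued_at < max_age_seconds

def session_valid_condition_based(condition_still_true) -> bool:
    """Emerging model: 'as long as X is true, no need to re-authenticate'.
    condition_still_true is a placeholder for a sensor check, e.g. 'paired
    wearable within BLE range' or 'band still on the wrist'."""
    return condition_still_true()

# Example: require the unlock pattern only when the paired watch is out of BLE range
def require_unlock(watch_in_ble_range) -> bool:
    return not session_valid_condition_based(watch_in_ble_range)
```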

Even in this new model, there will be differences in the certainty with which the persistence of X can be determined - the Nymi and Apple Watch, because they more tightly bind the user to the authenticating device, would likely offer more assurance than the Samsung Gear (I can take the Gear off my wrist and the S5 will be oblivious).

Of course, the ‘As long as X’ condition is only viable if there are local sensors able to monitor the state of X - whether Bluetooth proximity, or skin contact, or heart rate measurement, or future buttock-to-sofa contact etc. 

But fortunately the things that we are more and more surrounding ourselves with, even if primarily intended for some other purpose (think light bulbs, thermostats, and garage doors), will provide those sensors and so the ability to monitor all the different X’s we can think up.