...
Valter Nordh – Supporting GÉANT in updating the terms of reference for the technical programme. Plans are to present a draft at the next GA in September. Some TTC members' terms have expired; Valter proposed to prolong the expired mandates until the end of 2015. No objections were raised.
...
Peter Schober –
...
IDM Issues in the R&E community
PSc, as part of the more in-depth area presentation each TTC member offers, gave an overview of the authentication and authorisation practices in the R&E community.
There is still a lot of phishing, and asking subjects to use ever more complex passwords obviously won't help there. Mitigations for this are strong authentication, i.e. 2-factor or multi-factor authentication, which in practice means a combination of independent authentication methods or technologies.
Yubico and Google have joined the FIDO alliance, promoting 2-factor authentication specs (U2F: "Universal 2nd Factor") that use established technologies (public key cryptography) and protocols that are now being integrated into the browser.
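The core idea behind a U2F-style second factor is a per-login challenge signed by a key that never leaves the hardware token. As a self-contained illustration (real U2F uses per-origin ECDSA public-key signatures and browser APIs; here HMAC stands in for the device signature so the sketch stays dependency-free, and all class and method names are hypothetical):

```python
# Sketch of the challenge-response flow behind a hardware second factor.
# HMAC stands in for the device's public-key signature; names are illustrative.
import hashlib
import hmac
import os

class Token:
    """Stands in for a hardware token holding a per-service secret."""
    def __init__(self):
        self._key = os.urandom(32)           # in real U2F this never leaves the device

    def register(self) -> bytes:
        return self._key                     # real U2F would return a public key instead

    def sign(self, challenge: bytes) -> bytes:
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

class Server:
    def __init__(self, registered_key: bytes):
        self._key = registered_key

    def new_challenge(self) -> bytes:
        return os.urandom(16)                # fresh per login attempt, defeats replay

    def verify(self, challenge: bytes, response: bytes) -> bool:
        expected = hmac.new(self._key, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

token = Token()
server = Server(token.register())
challenge = server.new_challenge()
assert server.verify(challenge, token.sign(challenge))           # second factor accepted
assert not server.verify(server.new_challenge(), token.sign(challenge))  # replayed response rejected
```

The point of the fresh challenge is that a phished or replayed response is useless for any later login, which is exactly what ever-longer passwords cannot provide.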
Most of the requirements for 2-factor authentication come from users trying to protect their passwords, rather than from the resources.
Despite what many believe, second-factor authentication is not really a way to increase the assurance that the credentials are used by the right people. To elevate the assurance other means are needed, e.g. verified identity-vetting processes, which normally increase the cost of authentication.
A problem institutions still face is password resets, which remain a time-consuming operation and affect identity assurance. To date there is no fully automated way to do this, as new passwords have to be propagated into the different databases.
PSc touched upon authorisation, which usually presupposes the user has been previously authenticated.
Identity management in the academic space is very complex as there are lots of different roles (and combinations of them at different levels) to handle. For some services authentication and authorisation can overlap, but in general this is not good practice. Commercial companies are expanding the authentication process with data mining, taking into account an ever growing list of contextual and environmental factors (OS, browser fingerprinting, IP addresses, geolocation, etc.), and use the results for authorisation purposes. This mixes the authentication and authorisation processes, which is what federated access separated in the first place.
Academic licences can be complex, so it is very difficult to translate them into operational procedures. An example of this is Clarin, where the authorisation parameter chosen is to allow access to resources to "academics". They basically mapped a licence grant of rights limited to specific uses ("for educational, teaching or research purposes") onto an authorisation process, not realising that there is no generally agreed upon concept of "academic" (nor machine-consumable information for it in institutional IDM systems). This approach is causing problems, as it is based on a fundamental misconception: no IDM process or authorisation attribute can ever give the licence holder the assurance that the subject accessing the resource will be using it in accordance with the licence terms.
PSc also touched upon provisioning, the process of making sure that data to be used in distributed environments are available in the different places. One approach is to push the data to all applications in case they are needed ("just in case"); this model has issues with federated approaches, as the number of applications might be huge, rapidly changing or unknown in advance. The other approach is to provision the data when needed ("just in time"), which has issues with authorisation (e.g. authorising someone can only happen after a local account has been provisioned for them), resulting in awkward workflows: e.g. having to ask (and wait for) a group of people to log in to a system first (in order to get their accounts provisioned "just in time"), at which point those subjects do not yet have access to the resource, and then authorise them later (and ask the subjects to return after they have been properly authorised).
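The contrast between the two provisioning models can be sketched as follows (a minimal illustration; the directory contents and function names are hypothetical, not any real provisioning API):

```python
# Illustrative contrast of "just in case" vs "just in time" provisioning.
# All names here are hypothetical.

directory = {
    "alice": {"affiliation": "staff"},
    "bob": {"affiliation": "student"},
}

def push_all(services):
    """'Just in case': push every record to every service up front.
    Breaks down in a federation, where the set of services is huge,
    rapidly changing, or unknown in advance."""
    for svc in services:
        svc.update(directory)

def ensure_account(svc, user, attrs):
    """'Just in time': create the local account at first login.
    Downside: you cannot authorise an account that does not exist yet."""
    if user not in svc:
        svc[user] = dict(attrs)   # provisioned on first login
    return svc[user]

wiki = {}                          # a relying service with no accounts yet
ensure_account(wiki, "alice", directory["alice"])
print("alice" in wiki, "bob" in wiki)   # only alice has logged in, so only she exists
```

The awkward workflow described above falls directly out of the second model: an administrator cannot add "bob" to an access group on the service until bob has logged in once and the account exists.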
De-provisioning is normally not properly done, though there's some support in the protocols used; the general approach followed is to reset the password at the Identity Provider (and leave the data at SPs to rot).
The last part of the talk covered attributes and their usage. Typical problems in this area:
- Agreeing on the syntax and semantics
- The complexity of storing and processing human names from different cultures
- Identifiers and their many properties
- Who gets the attributes the IdP releases
- Who decides based on what.
Currently the R&E community is using two main approaches, or even a mix of the two: a risk-based approach (REFEDS R&S) vs a full-compliance one (GÉANT CoCo).
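The difference between the two approaches can be sketched as an IdP-side attribute filter: release a fixed minimal bundle to any service in a trusted category (risk-based, as in REFEDS R&S), or release exactly what each service has declared it needs (compliance-based, as in the GÉANT Code of Conduct). The attribute names below follow the eduPerson schema, but the policy logic itself is a hypothetical sketch, not any real IdP configuration:

```python
# Illustrative sketch of two attribute-release policies; names are hypothetical.

RS_BUNDLE = {"displayName", "mail", "eduPersonPrincipalName",
             "eduPersonScopedAffiliation"}

def release(user_attrs, sp):
    """Decide which user attributes to release to a service provider (SP)."""
    categories = sp.get("entity_categories", ())
    if "refeds-r-and-s" in categories:
        wanted = RS_BUNDLE                                # fixed bundle for the category
    elif "geant-coco" in categories:
        wanted = set(sp.get("required_attributes", ()))   # SP-declared minimum
    else:
        wanted = set()                                    # default: release nothing
    return {k: v for k, v in user_attrs.items() if k in wanted}

user = {"displayName": "Ada Lovelace",
        "mail": "ada@example.edu",
        "eduPersonPrincipalName": "ada@example.edu",
        "homePostalAddress": "..."}

print(release(user, {"entity_categories": ["refeds-r-and-s"]}))
print(release(user, {"entity_categories": ["geant-coco"],
                     "required_attributes": ["mail"]}))
```

Either way the extraneous attribute (the postal address) is never released; the approaches differ in who decides the scope of the release, the category definition or the individual SP's declaration.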
Lastly Peter touched upon eduGAIN and related services offered by GÉANT. Thanks to the work done by the community within the GÉANT project and within REFEDS, it is now much easier for an NREN to create a federation: there's a federation policy template, best practice documents, and FaaS (video showcase), which offers a SAML entity registry, metadata aggregation plus secure signing with an HSM (which makes support for local installation impossible), information on entity categories, discovery documentation, and so on.
PSc's presentation covered many interesting aspects; some TTC members asked which areas NRENs are really focusing on.
ACTION: Peter to review his slides and distil what is being worked on and what is not.
...