Tuesday, June 20, 2017

ID2020 Summit

Yesterday, I attended the ID2020 Summit, held at the UN and at the Microsoft office near Times Square. It was a great meeting of folks from the humanitarian and private sectors focused on identity as a fundamental human right.

I thought the following points stood out (obviously with my bias :) ):

Identity is a fundamental human right.

The ID2020 conference was about that, obviously, but come to think of it, is there anything that can be done in the world today without an identity? Getting a mobile phone, probably the one universal device everyone has, requires an identity.

Human identity needs to be decoupled from national identity

If identity is a human right like food or water, then it should probably be decoupled from national identity. Could there be a truly globally distributed identity system? Is that even possible? And is it really needed? A government-issued identity is probably more valuable to a farmer than a fully distributed one. Food for thought.

Vaccination card is often the first identity of a human being

Almost every child in the world is vaccinated. A vaccination card is often the first identity for a child in a world where birth certificates do not exist.

Simplicity is king - for identity too

If there is one lesson from Aadhaar, India's identity system for a billion people, it is that simplicity is king. An identity system that tries to meet requirements for everyone cannot work. Instead, a simple system that provides citizens with a verifiable digital identity, and allows applications to be built on top using derived identities, is probably the way to go.

Of course, in a system like Aadhaar, the issuing authority is the government. What if the government turns on its citizens?

A human should own their identity

There is a reason phrases such as "I have my own identity" exist in English. A human should own their identity and the data associated with it, and provide consent for its use.

Identity is for things too

While identity for humans is important, the things in the "Internet of things" need an identity too.

Verification / attestation of digital identity

When identity goes digital, the need for verifying identity arises. Aadhaar, for example, handles 600 million verifications per month. Similarly, identity attestation is needed for various applications.

The above list is just my recollection; there were many interesting discussions in different focus groups.

Onwards to digital identity.

Tuesday, August 23, 2016

Solving Developer's Security Challenges Through A Secure DevOps Pipeline (SDO)

The previous post described the challenges a developer faces in securely deploying applications and managing their steady-state security. The difficulties arise because not every security and compliance capability needed by developers is available as a service consumable through an API (e.g., invoking an API to start a penetration test). As a consequence, the developer has to determine all the details of security and compliance for their application, which is no easy task.

Before we examine the various security and compliance capabilities needed by a developer, we must understand the life cycle of an application.

Life Cycle of an Application

An application life cycle comprises development and build, deployment and testing, running and updates, and finally decommissioning. The following figure shows a typical application life cycle.

We also refer to this life cycle as the DevOps pipeline. In an ideal DevOps pipeline, developers develop their applications, package their application components and dependencies using technologies such as Docker to create "immutable code", deploy them using templates such as Docker Compose, run them in a cloud, deploy updates continuously, and may ultimately decommission an application, partly or entirely.

DevOps Tension with Security and Compliance

This "continuous development and deployment" cycle creates tension with conventional security approaches. For example, applications comprising multiple components (e.g., a web server and a database server) need keys or passwords for communication, yet these keys and passwords often end up in continuous integration tools such as Jenkins or Travis with limited or no access control, or worse, in code. Moreover, the "immutable code" images may end up in public repositories. Worse still, it may be a requirement that no security credentials be part of the "immutable image"; such credentials must only be available at run-time. Also, source code scans, network scans, and malware scans may not be performed before every continuous update, leading to security holes.
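To make the "credentials only at run-time" requirement concrete, here is a minimal sketch (the variable names are illustrative, not a real API) of a component reading its database credentials from the environment at start-up instead of baking them into the immutable image:

```python
import os

def get_db_credentials():
    """Read database credentials at run-time from the environment.

    The environment variables are injected by the orchestrator (e.g., via
    Docker's -e flag or a secrets manager) so they never appear in the
    immutable image, in the CI tool, or in source control.
    """
    user = os.environ.get("DB_USER")
    password = os.environ.get("DB_PASSWORD")
    if user is None or password is None:
        raise RuntimeError("database credentials not provided at run-time")
    return user, password
```

If the variables are absent, the component refuses to start rather than falling back to a hard-coded default.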

Secure DevOps Pipeline

How can a system possibly alleviate the difficulties developers face in securely deploying applications and managing their steady-state security? The answer is to convert every possible security function, whether a capability or an advisor, into a service that is easily consumed by a developer with little or no effort. If achieved, this is the ultimate "Secure DevOps Pipeline", where every security and compliance function is consumable through an API.

So what are the security and compliance capabilities needed in a Secure DevOps Pipeline? Here is a pictorial rendition of key capabilities needed in such a pipeline:


Secure DevOps - Develop and Build Phase

Application development typically starts with a problem statement, design, and initial prototypes. The prototypes and subsequent revisions targeted for production comprise code written by application developers. The code may also have dependencies such as libraries, dependent packages, and so on.

TRUSTED CODE capability ensures that all application code and its dependencies come from trusted repositories.

The source code written by application developers is stored in trusted and managed repositories. Such repositories may be configured to run automatic code scans on every commit.

When an application developer builds an immutable image (e.g., a Docker image) from application source code and dependencies, package installation tools such as apt-get install XYZ may pull in dependent packages. If the version and source of those dependent packages are not controlled, old packages with potential vulnerabilities get installed, or worse, malware may creep in. The job of the "Trusted Code" capability is to ensure that application dependencies come from known or trusted sources.

In image formats such as Docker images, a developer has complete control over which repositories to use. Without any restrictions, a developer may download code or dependencies from anywhere, leading to the first chink in the security posture: is the code even trusted? The job of the "Trusted Code" function is to apply restrictions on the source and version of application code as well as dependent packages.

Trusted code is easier said than done. As mentioned above, it involves restricting developers to a set of trusted repositories, not all of which may satisfy the dependencies of an application. However, verifying code, packages, and dependencies and adding them to trusted repositories is a continuous process. Over time, the curated repositories will contain packages that satisfy most developers' needs.
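As a toy illustration of one check a "Trusted Code" service might run, the following sketch flags requirement lines that are not pinned to an exact version; a real service would also verify the repositories and package signatures themselves:

```python
def unpinned_requirements(lines):
    """Return requirement lines that are not pinned to an exact version.

    A pinned line looks like 'package==1.2.3'; anything looser (>=, ~=, or
    no version at all) lets the build pull an uncontrolled version.
    """
    bad = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if "==" not in line:
            bad.append(line)
    return bad
```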

CODE SCAN capability scans the source code written by developers to ensure that it complies with security best practices. This capability is especially needed if "Trusted Code" capability is not available.

Developers are lazy. They will typically take the shortest route to complete their task. Without adequate enforcement of coding best practices, errors may creep into the code that lead to security holes, e.g., SQL injection.
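As an example of a best practice a code scan would enforce, here is the parameterized-query idiom that avoids SQL injection, shown with Python's sqlite3 purely for illustration:

```python
import sqlite3

def find_user(conn, username):
    """Look up a user with a parameterized query.

    The '?' placeholder lets the driver treat the input as data, so a value
    like "alice' OR '1'='1" cannot alter the query's structure.
    """
    cur = conn.execute("SELECT id, name FROM users WHERE name = ?", (username,))
    return cur.fetchone()

# The anti-pattern a code scan should flag (never do this):
#   conn.execute("SELECT id, name FROM users WHERE name = '%s'" % username)
```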

CODE / PACKAGE VULNERABILITY determines vulnerabilities in dependent code used by an application. While source code written by developers may get appropriate scanning as part of "Code Scan" service, the vulnerabilities in dependent code / packages need to be identified through such a service.

An application may have numerous dependencies that are satisfied through numerous package management tools. Traditionally, such package management has been associated with utilities such as apt-get install or yum install. However, modern applications make use of several package managers, such as Python's pip or Node.js's npm. Worse, a package manager may not always exist, e.g., for .jar packages available on Bintray. While operating system distributions such as Red Hat or Ubuntu regularly publish security bulletins for packages in their repositories, the same cannot be said for every package manager. Consequently, it may not be possible to accurately determine the vulnerabilities associated with each package.

Nevertheless, such a service is crucial in identifying known vulnerabilities.
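A minimal sketch of such a vulnerability check, assuming advisory feeds have been normalized into a name-to-vulnerable-versions map; the package names and versions below are made up for illustration:

```python
def vulnerable_packages(installed, advisories):
    """Report installed packages that appear in a vulnerability advisory feed.

    `installed` maps package name to version; `advisories` maps package name
    to the set of versions known to be vulnerable. Real feeds (distribution
    security bulletins, language-ecosystem databases) would be normalized
    into this shape by the service.
    """
    findings = []
    for name, version in installed.items():
        if version in advisories.get(name, set()):
            findings.append((name, version))
    return findings
```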

APPLICATION CONFIGURATION VALIDATION is required to determine the correctness of application configuration.

Modern applications are complex. They comprise hundreds and sometimes thousands of configuration settings. Getting application configuration right from a security perspective, when the application is deployed through layers of automation, is extremely hard.

Application configuration validation is a service that validates an application or component configuration from a security (or potentially performance) perspective and alerts the developer to incorrect configurations before the application is deployed into production.
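A sketch of what such a validation service might check; the two rules shown (debugging off, modern TLS only) are illustrative stand-ins for the hundreds a real service would carry:

```python
def validate_config(config, rules):
    """Check an application configuration against simple security rules.

    `rules` maps a configuration key to a predicate that must hold; every
    missing or violated key is returned so the developer can fix it before
    deployment.
    """
    violations = []
    for key, predicate in rules.items():
        if key not in config or not predicate(config[key]):
            violations.append(key)
    return violations

# Illustrative rules: debugging must be off, TLS 1.0/1.1 must not be accepted.
RULES = {
    "debug": lambda v: v is False,
    "min_tls_version": lambda v: v in ("1.2", "1.3"),
}
```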

MALWARE / VIRUS SCAN is a service that determines if any malware or virus has crept into the immutable image of an application build.

LICENSE VERIFICATION is a service that validates the license of application components being deployed.

License verification is needed for two reasons. First, the service ensures that only components with known licenses are deployed. For a capability delivered as a service, this is less of a concern, since even components with the most restrictive licenses, such as GPL, can potentially be used in delivering the service.

The other reason for license verification is appropriate charge-back. For traditional enterprise software, it is important to determine the appropriate licenses for the software being deployed. However, charge-back for deployed software becomes less of a concern as traditional software is increasingly delivered as a service.

DEVELOPMENT / AUTOMATION CREDENTIAL MANAGEMENT is critical in ensuring good security practice. Building an immutable image for an application is often done through automation (e.g., Jenkins or Travis). Ensuring appropriate access controls and key management for such credentials is critical for good security hygiene.

As with user authentication, authorization, and access, any credentials required for automation must be kept in a credential or key store that is delivered as a service.
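A minimal in-memory stand-in for the interface such a key store might expose; a real service would add authentication, access control, and audit logging on every call, and this class exists only to illustrate the shape of the API automation tools would consume:

```python
class KeyStore:
    """Toy credential/key store illustrating the consumable interface."""

    def __init__(self):
        self._secrets = {}

    def put(self, name, value):
        """Store a credential under a hierarchical name."""
        self._secrets[name] = value

    def get(self, name):
        """Fetch a credential; missing names fail loudly."""
        if name not in self._secrets:
            raise KeyError("no such credential: %s" % name)
        return self._secrets[name]
```

Automation tools would then call `get("jenkins/deploy-key")` at build time instead of embedding the key in job configuration.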

Secure DevOps - Deploy and Test Phase

Application deployment and testing happen continuously and iteratively with development, and eventually with running. The following key capabilities during deployment and testing, delivered through an API, can help reduce the security burden on a developer.

APPLICATION PATTERNS allow developers to specify how the various components of an application are combined, typically over a network, to deliver the application's function. Such a specification is then executed by a deployment engine.

Cloud platforms provide numerous ways of specifying a template for application deployment. These templates vary in flexibility and ease of use. Typically, the templates that are highly flexible make it onerous for developers to specify correct security properties.

From a security perspective, application deployment specifications need to follow the "secure by default" principle while giving developers the flexibility to override any details.
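The "secure by default with overrides" idea can be sketched as follows; the default keys and values are illustrative, not any real pattern format:

```python
# Illustrative secure defaults: storage encrypted, modern TLS, HTTPS only.
SECURE_DEFAULTS = {
    "encrypt_storage": True,
    "min_tls_version": "1.2",
    "open_ports": [443],
}

def render_pattern(overrides=None):
    """Build a deployment specification starting from secure defaults.

    The developer overrides only what the application actually needs;
    everything left unspecified stays at its secure default.
    """
    spec = dict(SECURE_DEFAULTS)
    spec.update(overrides or {})
    return spec
```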

APPLICATION CREDENTIALS AND KEYS need to be managed similar to user credentials or deployment tools credentials.
Applications comprise multiple components, such as a web server and a database server (see this example), or even remote services delivered through APIs. Credentials are required for communication among components or with remote services. Such credentials, referred to as application credentials and keys, must also be stored in key-management systems.
Thus, a key-management system will store credentials for deployment automation as well as for inter-component communication. Such credentials will likely vary across development, stage, and production pipelines.

These keys may also need to be periodically rotated, which places an additional burden on the developer. Having a service that will automatically [re]generate the keys and configure the application or its components with the newly generated keys can significantly reduce the key-management burden on the developer.
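A sketch of such a rotation service; the components here are stand-ins for the real reconfiguration step (an API call or a config reload), and the 256-bit key size is an illustrative choice:

```python
import secrets

def rotate_key(components):
    """Generate a fresh key and push it to every component of an application.

    `components` is a list of objects with a `set_key` method, standing in
    for whatever mechanism actually reconfigures each component.
    """
    new_key = secrets.token_hex(32)  # 256 bits, hex-encoded
    for component in components:
        component.set_key(new_key)
    return new_key
```

Run on a schedule, this keeps all components in sync on the current key without any manual steps.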

SECURE AUTO SCALING. Applications deployed in a cloud need to scale as load increases. Ideally, this scaling is done automatically. Such scaling implies that portions of incoming traffic will be routed to a newly spun-up application component responsible for handling that traffic. The new routing must also be secure. That is, any needed credentials must be provisioned to newly spun-up components, and removed from decommissioned ones, at run-time.

MONITORING must be configured for application components whether the trigger is a fresh deployment, an upgrade, or auto-scaling. Monitoring encompasses traditional metrics such as CPU, memory, disk, and network; application-specific metrics; and logs. Monitoring can be passive or active.

As part of monitoring configuration, malware and anti-virus scanning may also be configured.

APPLICATION NETWORK SCANS and AUTOMATED PENETRATION TESTING. When an application is deployed or updated, appropriate network scans and penetration tests must be performed before exposing it to general users. Typically, network scans make use of tools such as Nessus, while penetration tests are done through a manual process. These scans and tests can be automatically invoked upon a new deploy or commit in dev, staging, or production.

With some aid from the developer, penetration tests can also be performed automatically, contributing to an automated secure DevOps pipeline.

ENCRYPTED / INTEGRITY STORAGE. Applications deployed in a cloud may require the underlying storage to be encrypted or to provide guarantees against tampering. A cloud may provide encrypted and integrity-protected storage as part of its offerings. The application developer may configure the use of such storage in an application pattern or otherwise.

DevOps personnel may specify the use of encrypted storage as part of application patterns. They may bring their own keys or have the cloud auto-generate the keys for encrypting storage. Both types of keys need to be managed, similar to user and API keys.

NETWORK AND APPLICATION FIREWALLS / IDS AND APPLIANCES may also need to be deployed to meet regulatory compliance and good security practice.

In application pattern templates, a developer may indicate the applicable compliance regime. The cloud can then automatically deploy and configure network and firewall appliances on behalf of the user.

Admittedly, deploying and configuring network and application firewalls is a "black art". It is often very difficult to get right due to the myriad configurations involved.

By following the "secure by default" principle and converting the most commonly used aspects of these appliances into functions delivered through APIs, the deployment and configuration of these devices can be integrated into a secure DevOps pipeline.

TESTING requires that the various components of an application, and the application as a whole, be tested continuously. The tests, whether unit, functional, or integration, must be written by developers, and their invocation should happen automatically as part of the build or deploy phases.

Secure DevOps - Run Phase

The run phase invokes certain capabilities of the develop and build, and deploy and test, phases in a continuous manner; these capabilities were explained earlier.
The following additional capabilities are needed in the run phase.

APPLICATION / CLOUD CONFIGURATION VALIDATION. Once applications are deployed in a cloud, the cloud configuration also needs to be validated. Cloud configurations encompass the configurations of various cloud-based services, such as firewalls, encrypted storage, key lengths, security groups, geographical distribution, and so on. Thus, as part of running the application on a cloud, both application and cloud configurations need to be validated together.

SCANNING FOR SENSITIVE INFORMATION IN LOGS. Sensitive information, such as personal health information (PHI) or social security numbers, needs to be scrubbed from application logs that may be viewed for debugging or administrative purposes. If best practices were followed during development, such information would not have ended up in the logs in the first place. Nevertheless, the scanning service may tag sensitive pieces of information in logs, which may then need to be scrubbed or removed altogether.
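A minimal sketch of such a scrubbing pass, here covering only US social security numbers; a real service would carry patterns for many more categories of sensitive data:

```python
import re

# Matches the common SSN layout 123-45-6789; real deployments need far more
# patterns (PHI identifiers, credit card numbers, tokens, etc.).
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scrub(line):
    """Replace anything that looks like a US social security number."""
    return SSN_RE.sub("[REDACTED]", line)
```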

EVENT LOGGING AND MONITORING that was configured in the deploy and test phase must be watched for any application events, malfunctions, or incidents.

ENCRYPTION OF COMMUNICATION. The communication from end-users of an application, or among its various components, must be appropriately encrypted, meeting the security and compliance guidelines. This communication must be periodically monitored, especially across configuration changes, to ensure that communication that was encrypted using, say, high-strength ciphers has not been downgraded to a low-strength cipher as a result of an update.

If the secure DevOps pipeline is followed completely, all application component communication will likely be encrypted or confined to isolated networks. Encryption can be set up as part of the specification in an application pattern.
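One way such monitoring could detect a downgrade is to compare the ciphers observed on live connections against an allow list; the cipher names and endpoint labels below are illustrative:

```python
# Ciphers the security guideline permits; the list is illustrative only.
ALLOWED_CIPHERS = {
    "TLS_AES_128_GCM_SHA256",
    "TLS_AES_256_GCM_SHA384",
    "ECDHE-RSA-AES256-GCM-SHA384",
}

def downgraded_endpoints(observed):
    """Return endpoints whose negotiated cipher is not on the allow list.

    `observed` maps an endpoint name to the cipher seen on its connections;
    in practice this would come from periodic probing or handshake logs.
    """
    return [ep for ep, cipher in observed.items()
            if cipher not in ALLOWED_CIPHERS]
```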

INTEGRITY MONITORING. The integrity of data stored by an application must be monitored through APIs. Any update to the data must be recorded and be alertable so that appropriate actions can be taken.
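A sketch of hash-based integrity monitoring: store a SHA-256 fingerprint when data is written, then periodically re-check it; a mismatch is the alert-worthy condition.

```python
import hashlib

def fingerprint(data):
    """SHA-256 digest of a byte string, used as a tamper-evidence fingerprint."""
    return hashlib.sha256(data).hexdigest()

def verify_integrity(data, expected_digest):
    """Return True when stored data still matches its recorded fingerprint."""
    return fingerprint(data) == expected_digest
```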

OPERATIONS of the service must be fully automated, and changes must be logged. These changes include bringing up or tearing down services, logging for administrative purposes, and so on.

Secure DevOps - Decommission

Eventually, part of an application or an entire application may need to be decommissioned. Partial decommissioning may happen, for example, if the load on the application decreases. Full decommissioning may happen if the application is no longer needed. As part of decommissioning, the resources used by the application may need to be scrubbed.


As part of virtual resource decommissioning, any keys that were used but are no longer needed must be appropriately deleted.

Secure DevOps - Policy and Verification That Puts it All Together

Since security features are delivered via APIs as part of the secure DevOps pipeline, there is a need for a policy that checks for violations of security requirements, and a verifier engine that validates the results of the various features along the secure DevOps pipeline.

In summary, delivering all security functions through APIs and making them readily consumable by developers is not easy. The ingredients that will make a secure DevOps pipeline possible are key management (user, automation, application), scanning (code, package, configuration, network, malware, virus), testing (penetration and functional), patterns, logging (API, access), and authn/authz.

Saturday, July 16, 2016

Can a Cloud Help Developers "Securely" Deploy, Run, and Manage their Applications?

Can a cloud help developers deploy applications securely and manage their steady-state security? In this three-part series, I will discuss this question in detail.

This part covers the challenges a developer faces in deploying applications securely, and managing their steady-state security.

Challenges in Securely Deploying Applications and Managing Their Steady-State Security

Typically, cloud is associated with scale. If an entity wants to scale its application to serve hundreds of thousands of users, cloud is the answer.

Modern applications, deployed on a cloud or elsewhere, are complex. They comprise multiple components, which interact with each other over a network. Each component may or may not hold state, and typically delivers a unit of functionality (micro-services, etc.).

In a cloud, a developer writes automation code to deploy and upgrade these applications, and may leverage cloud monitoring tools to manage them.

Security and compliance require a developer to deploy applications according to the accepted security best practices of the time. Conforming to these best practices requires substantial security knowledge and expertise on the part of the developer.

As an example, consider a simple application comprising a web server and a database server as shown in figure below.

To ensure the security and compliance of this application, a developer has to take a number of steps:

  1. Best Practices for Source Code. The source code should conform to well-known security best practices. As an example, any SQL queries issued by the web server must not be prone to SQL-injection attacks.
  2. Message Confidentiality. The communication between users of this simple application and the web server, as well as web server and database server must be encrypted.
    Implementing this functionality typically requires knowledge of appropriate TLS protocols and ciphers, and the creation and correct configuration of PKI certificates and encryption keys. While services such as LetsEncrypt make it easy to generate certificates, it is still up to the developer to configure her applications correctly with them. Moreover, these certificates need to be periodically rotated.
  3. Managing Application Keys, Certificates, and Passwords. Establishing 'trust' among various components of an application is done through keys, certificates, and passwords. While we typically associate passwords and certificates with end users, the components of an application must establish authentication and authorization among themselves to establish trust. The burden for creating, storing, and managing these credentials is on the developer.
  4. Encryption of Data at Rest. Compliance regimes such as PCI and HIPAA may require the data at rest to be encrypted. A developer has to configure database servers to encrypt the data per record or per disk, and manage the keys associated with them.
  5. Configuring Security Groups. A common feature of IaaS clouds is that they allow incoming traffic to an instance to be limited to certain open ports. It is non-trivial to configure security groups correctly for an application whose components run in tens of instances or more, due to complex interactions, high-availability requirements, etc.
  6. Collection of Logs, and Scrubbing of Sensitive Information in Logs. The logs generated by multiple application components need to be collected to meet security and compliance as well as operational needs. The developer needs to ensure that no sensitive information, such as the passwords of the service's users, is present in the logs.
  7. Setting Up Intrusion Detection Systems (IDS) and Firewalls. A developer may need to set up intrusion detection and prevention systems to log any suspicious activity, as well as any other firewalls. Typically, in IaaS clouds, security groups double as firewalls, so a separate firewall deployment may not be needed.
  8. Admin Access to Web and Database Servers. The developer or operator of this simple two-tier application needs to manage credentials for admin access to the web and database servers.
  9. Network Scan of Web and Database Servers. While a developer may have undertaken necessary steps to configure and deploy her application securely, it is still prudent to perform network scans before and after deployment, to ensure conformance to the intended model. These network scans can be as simple as port scans, or can also be application specific.
  10. Penetration Testing of Application Components. Similar to network scans, penetration tests actively try to break the application as may be attempted in wild. Such tests are customized for the application (web server in the above example).
  11. Secure Scaling of Application. When scaling the web server or database server, the application's keys, certificates, and passwords need to be set up correctly and stored appropriately. Even in a simple application such as the one shown above, it is non-trivial to configure keys and certificates on load balancers or other components required for the application to scale.
As can be seen from this example, it is a non-trivial task for developers to deploy and configure their applications securely, and it requires significant expertise on the developer's part.
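As a toy illustration of the security-group check in step 5, the following sketch flags rules that open ports outside an allowed set; the ports and CIDR ranges are illustrative:

```python
def unexpected_open_ports(rules, allowed):
    """Flag security-group rules that open ports outside the allowed set.

    `rules` is a list of (port, source_cidr) pairs. For the two-tier example,
    `allowed` might be {443} on the web server, and {5432} on the database
    server reachable only from the web tier.
    """
    return [(port, src) for port, src in rules if port not in allowed]
```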

In the next post, I will describe the concept of a "Secure DevOps Pipeline", which can facilitate a developer in deploying and managing her applications in a secure manner.

Sunday, June 19, 2016

In the company of giants - at IEEE Annual Award Ceremony

Who would not be inspired, sitting in the company of giants and marveling at the accomplishments of the highly accomplished individuals who received awards at the IEEE Annual Award Ceremony, held at Gotham Hall, NY? I was able to attend thanks to an invitation from the Marconi Society. One award in particular resonated with me at a personal level: the one given to Charles Mistretta. He was awarded the IEEE Medal for Innovations in Healthcare Technology for the development of techniques that have drastically changed how imaging for cardiovascular diagnosis is done. Were it not for him and other individuals like him, my father's diagnosis of cardiovascular disease in 1980 may not have been possible, and perhaps I would not have been sitting in the ceremony yesterday.

Thursday, June 16, 2016

SPEC Cloud IaaS 2016 Benchmark

I was fortunate to lead the design and implementation of SPEC Cloud IaaS 2016 - the first industry standard benchmark to measure and compare the performance of IaaS Clouds. Read more on my IBM blog entry.

Saturday, April 18, 2015

Eclipse Luna on Mac OS X Mavericks

Setting up Eclipse Luna on Mac OS X Mavericks is not straightforward. I ran into some issues and had to search a bunch of posts to solve the problems.

Here are the steps for getting Eclipse Luna working correctly on Mac OS X Mavericks.

  1. Install the Java SDK. If you just download the JRE, you will run into problems. Eclipse Luna requires Java version 7 or higher.
  2. Open a terminal and test that your version of java is correct:
    $ java -version
    java version "1.7.0_79"
    Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
    Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)

  3. Download Eclipse Luna
  4. Copy Eclipse to your Applications folder. If you already have an eclipse folder there, make a backup copy.
  5. Open Eclipse by double-clicking the Eclipse icon. If you get an error similar to the one below, go to the next step:
    eclipse The JVM shared library "/Library/Internet Plug-Ins/JavaAppletPlugin.plugin/Contents/Home/bin/../lib/server/libjvm.dylib" does not contain the JNI_CreateJavaVM symbol.
  6. Open the following file:
    $ vi /Applications/eclipse/Eclipse.app/Contents/Info.plist

    Search for Eclipse and, inside the Eclipse array, add string entries pointing to your JDK's java binary. The exact path depends on your JDK version; for the JDK 1.7.0_79 shown above, it would be:

    <string>-vm</string>
    <string>/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/bin/java</string>

    You can check the jdk folder path as follows:
    $ ls /Library/Java/JavaVirtualMachines/ | grep jdk

Sunday, July 29, 2012

Cloud SLAs: Present and Future

If your company leverages cloud computing, you should take a look at my recent article, "Cloud SLAs: Present and Future", which appeared in the July 2012 issue of ACM Operating Systems Review. In this article, I describe the SLAs of major cloud providers such as Amazon, Rackspace, Microsoft Azure, Terremark, and Storm on Demand. The key findings are that none of the providers guarantee performance; instead, they provide some notion of availability. However, the notion of availability varies across cloud providers. EC2, a popular cloud, only guarantees availability for a data center, not for a VM! When you are building an application, it may serve you well to read the SLAs of cloud providers.

I have made the job easier for you :). Here is a link to my paper and presentation. I also provide guidance on how SLAs should be designed for future cloud services.