Originally published by CIO
An innovative initiative is only as successful as it is secure. Here’s how CIOs are balancing risk-taking with risk aversion.
At first glance, the deployment of cybersecurity measures and the pursuit of innovation might seem mutually exclusive. Strategies to enhance security are aimed at reducing risk, whereas innovation efforts require being open to taking risks.
But enterprises are finding ways to launch innovative new digital business initiatives while taking steps to protect data and other IT assets. By doing so, they’re establishing pathways to new revenue, improved customer experience, and new market opportunities, even as they tighten security requirements, protect systems and data, and remain compliant with regulations.
After all, this is the formula for success in today’s business environment: Push transformative initiatives that embrace innovative technologies such as the cloud, mobile tech, artificial intelligence, data analytics, and the internet of things (IoT) in a way that ensures the security of valuable systems and data.
To do so, today’s enterprises need to find ways to strike a balance between getting out in front of the competition (experimenting with new technologies, bringing proofs of concept to production, and so on) and the more risk-averse practice of ensuring these initiatives are secure.
In some cases, this might involve increasing budget and resources for the security of all systems; in others, it might mean setting aside budget and resources to secure pure innovation plays. Either way, the goal is to be innovative but in a safe, sensible way.
Here are some examples of how companies are attempting to balance innovation and security, either for specific projects or as a general practice.
Deploying new online services while keeping data safe
Higher education institutions need to be concerned about compliance with the Family Educational Rights and Privacy Act (FERPA), a law intended to protect the privacy of student data.
“Although compliance has long been a priority, traditional on-premises student information systems and data [were] generally locked down in those systems and files, with little concern about the outside world gaining access,” says Bill Balint, CIO at Indiana University of Pennsylvania (IUP).
“But with the advent of web-based access to secured systems, the prospect of large-scale security breaches and data exposures made FERPA compliance a much higher priority,” Balint says.
The issue has escalated, as higher education has been transformed into much more of a business-like operation, Balint says. “Institutions, striving to hit enrollment and student success targets, have increasingly turned to cloud-based, rapid implementation services in areas such as customer relationship management and data analytics solutions,” he says.
Such technology, which IUP is accessing via cloud-based subscription services, allows the university to offer innovative services such as helping to create optimized, personalized financial aid packages and customized academic analysis that can help attract and then retain successful students.
“But in order to do it, the vendors typically also require [that] loads of sensitive academic and/or financial data about the student be imported into vendor-controlled cloud applications,” Balint says. “Such activities take some security control away from the institution. Instead, the institution must often ‘take the contractual word’ of the vendor that the sensitive data will be secured both in transit and at rest.”
To ensure that sensitive data is not exposed, the first step IUP took was to consider the security and privacy implications of the cloud-based services and share only data that is core to a tool’s functionality.
“For example, it would be important to share grade information with a vendor whose tool is focused on academic success,” Balint says. “But Social Security numbers provide no value, and should not be shared.”
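The data-minimization step IUP describes can be thought of as an allow-list filter applied before any export to a vendor. The sketch below, with hypothetical field names rather than IUP’s actual schema, illustrates the idea: only the fields core to the vendor tool’s function leave the institution.

```python
# Sketch of the data-minimization idea: before exporting student records
# to a cloud vendor, keep only the fields the tool actually needs.
# Field names are illustrative, not an actual student information system schema.

ALLOWED_FIELDS = {"student_id", "course", "grade", "term"}  # core to an academic-success tool

def minimize_record(record: dict) -> dict:
    """Drop everything not on the allow-list (e.g., SSNs, financial data)."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

record = {
    "student_id": "S1001",
    "course": "MATH101",
    "grade": "B+",
    "term": "2023F",
    "ssn": "***-**-****",       # never leaves the institution
    "home_address": "...",      # not needed by the vendor tool
}

print(minimize_record(record))
```

The key design choice is an allow-list rather than a block-list: a sensitive field added to the record later is excluded by default instead of leaking by default.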
Beyond that, IUP requires that all the cloud vendors that it is working with meet industry standards for data privacy and security, via a formal contract and service-level agreements (SLAs).
“Very few higher education institutions can obtain the in-house expertise to perform the functions of these providers, but the services they provide are increasingly critical to the viability of many institutions,” Balint says. “The industry must continue to evolve best practices that protect sensitive and confidential data.”
Building security into a new mobile app
In late 2019, the State of Colorado announced the launch of Colorado Digital ID in its myColorado mobile app to transform the way residents interact with state government. The vision of myColorado is to provide state residents with an innovative, secure, and convenient mobile solution through which they can connect with digital identity and government services.
“Our goal is to make it easier for residents to conduct business with the state — such as renewing their driver license — by connecting them to services through a central mobile platform,” says Deborah Blyth, CISO for the State of Colorado. “This eliminates the need to visit a state office, thereby reducing time and transportation costs for residents and ultimately helping achieve customer delight.”
Since the public launch in October 2019, more than 30,000 residents have downloaded the myColorado app.
The state government realizes that gaining and maintaining public trust is paramount to ensuring widespread adoption of any product or service offered to residents, Blyth says, and this is best accomplished by ensuring that security is a critical component in application development.
Personal information in myColorado is protected by multi-factor authentication and data encryption for privacy and security throughout the app. In addition, myColorado employs user authentication, validation, and federation on several levels to ensure the identity of the user, Blyth says.
“Since the time myColorado was merely an idea, a security architect played an integral part on the app design team,” Blyth says. “From the project’s inception, there was a need to validate the identity of the mobile user to match that user to the appropriate information contained within the state systems.”
Other considerations included ensuring ongoing access to the user’s information with appropriate authentication to prevent unauthorized access, and evaluating and choosing a payment provider to process payments securely.
The development team performed testing to ensure that the mobile app and back-end servers were free from vulnerabilities that could be exploited and therefore lead to sensitive data being exposed. Additional precautions were also taken during the development process to prevent developers from accessing sensitive data, Blyth says.
A key factor in myColorado’s successful deployment of security features was that all security requirements were agreed upon and incorporated into the app through an iterative process as it was being designed and built, Blyth says.
“Having a security architect as an active and equal participant of the innovation team ensured that important security criteria were built in from the beginning, rather than simply viewed as a check-box at the end of the development cycle,” she says.
Taking an experimental approach to IT innovation
O.C. Tanner, which offers employee recognition and reward services to customers, is launching projects leveraging newer technologies or methodologies such as AI, 3D printing, and DevOps. In doing so, it follows a number of practices to ensure that data and systems are secure and that privacy is maintained, while at the same time not stifling innovation.
One of the most important is to treat new IT initiatives as guarded, scientific experiments. O.C. Tanner conducts technology trials that are small and that use the company’s existing processes and tools, and it isolates these endeavors from entities outside the organization.
“If one of our experiments has or creates a vulnerability, our existing processes should find the vulnerability,” says Niel Nickolaisen, senior vice president and CIO. “But if not, the vulnerability does not put the rest of our world at risk.”
Sometimes a vulnerability might cause the company to cancel the experiment or find ways to remediate or bypass the issue. “In one case we were experimenting with a new technology, discovered some issues and then worked with the [startup] provider to resolve the issues,” Nickolaisen says.
As the experiments pass certain proof points, which vary by the type of technology and experiment, “our standards for the production-worthiness of the experiment scale, and so the requirements become more stringent,” Nickolaisen says. “Before anything is released into our production environments, it must meet our standards — and those standards include security [and] privacy.”
In one example, O.C. Tanner was convinced that it had rich enough data that it could provide clients with insights into some of the causes
of employee turnover. To prove this, it needed to use client employee data, which it had to keep secure, in order to build cloud-based AI/machine learning algorithms.
“To start small, we anonymized a subset of our client data and did initial proofs of concept in the cloud service,” Nickolaisen says. “The results were encouraging enough that we felt we should move forward. But, at some point we would need to use actual, not anonymized, data.”
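The anonymization step Nickolaisen describes, done before moving client data into a cloud ML service, might look like the following sketch. The field names and salted-hash approach are hypothetical illustrations, not O.C. Tanner’s actual pipeline; real anonymization also has to account for quasi-identifiers that can re-identify people in combination.

```python
# Minimal sketch of pseudonymizing records before a cloud ML proof of concept.
# Direct identifiers are dropped; the record key is replaced with a salted hash
# so rows stay linkable without revealing who they belong to.
import hashlib

SALT = b"rotate-me-per-export"  # hypothetical per-export secret

def pseudonymize(employee_id: str) -> str:
    return hashlib.sha256(SALT + employee_id.encode()).hexdigest()[:16]

def anonymize(record: dict) -> dict:
    rec = dict(record)                         # leave the original untouched
    rec["employee_id"] = pseudonymize(rec["employee_id"])
    rec.pop("name", None)                      # drop direct identifiers outright
    rec.pop("email", None)
    return rec

sample = {"employee_id": "E42", "name": "Jane Doe", "email": "jd@example.com",
          "tenure_years": 3, "left_company": True}
print(anonymize(sample))
```

Keeping the salt out of the exported dataset means the cloud service can train on turnover patterns without being able to reverse the IDs.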
As O.C. Tanner scaled the experiment it also evaluated the security and privacy processes of the available cloud AI and machine learning services. “In parallel, we worked with our clients so that they could participate in our assessment of the security/privacy practices of the cloud providers,” Nickolaisen says. “We needed them to be as comfortable as we were with our choices.”
Another example involves the DevOps process and tooling the company is using. To ensure that the DevOps process met its security standards but still allowed for rapid deployments, O.C. Tanner wanted to create some type of automation so creators of new services and functionality could self-deploy only changes that were pre-approved.
“This required functionality or a tool that we did not have,” Nickolaisen says. O.C. Tanner found such a tool, but it was from an early stage startup and therefore presented some risk. “We did an experiment with their tool to assess their functionality,” he says. “When that experiment was successful, we started to apply our production-worthiness standards and identified some security and certification gaps in their product.”
O.C. Tanner then worked with the company to resolve the issues before moving into production.
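The “self-deploy only pre-approved changes” idea from the DevOps example above can be sketched as a simple gate in the deployment path. Nothing here reflects O.C. Tanner’s actual tooling; the approval set and function names are hypothetical.

```python
# Toy sketch of a deployment gate: developers may push a change to production
# only if it has already passed a security review/approval workflow.

APPROVED_CHANGES = {"feat-1287", "fix-3301"}  # populated by the approval workflow

def can_self_deploy(change_id: str, approved: set = APPROVED_CHANGES) -> bool:
    """True only for changes that were pre-approved for self-deployment."""
    return change_id in approved

def deploy(change_id: str) -> str:
    if not can_self_deploy(change_id):
        raise PermissionError(f"{change_id} has not passed security pre-approval")
    return f"deployed {change_id}"

print(deploy("feat-1287"))  # an approved change deploys without manual gatekeeping
```

The point of the design is that speed and security stop competing: approved changes flow through automatically, while unapproved ones are blocked by default rather than by a human bottleneck.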
Prioritizing customer experience — and data protection
Worldwide Assurance for Employees of Public Agencies (WAEPA), a provider of group term life insurance, has a goal of surpassing the competition in serving civilian federal employees and retirees by exceeding their expectations.
“As services and digital tools evolve, WAEPA realizes the need to transform and elevate every platform and touchpoint in the user experience,” says Brandon Jones, CIO.
Having a strong digital presence is foundational to meeting this vision, Jones says. To enhance its online presence, the organization first conducted a usability study, analyzing the current state of clients’
digital experience, conducting usability tests and user interviews, and synthesizing the data to identify trends, patterns, and commonalities across the information collected.
The research and analysis uncovered opportunities for usability improvements, allowing WAEPA to generate a set of findings and recommendations.
Next, WAEPA launched a “customer journey mapping effort” to
identify the various stages clients and prospects move through during a transaction, what they are expecting at each step, what questions they have at each step, and what they are feeling at each step.
“This exercise allowed us to identify the steps our members take to engage with our products and services, as well as the links to other parts of their journeys, opportunities for improvement, and areas where steps should be combined or split,” Jones says. “By methodically plotting our members’ steps, we were able to use this exercise as a diagnostic tool.”
WAEPA then began building a new website and member portal, leveraging findings from the usability study and customer journey mapping. The goals for the new site were to better educate users about products and the application process; improve consistency and ease of use across the site; provide self-service tools and information so users can make informed decisions; and “humanize” the experience to help, guide, and reassure online users.
Throughout the effort the protection of client data was a paramount consideration, and security was baked into the new website and supporting infrastructure. The security strategy includes the use of tools such as redundant firewalls, a virtual private network (VPN), protection against spam and phishing, and identity and access management.
By taking these and other steps, WAEPA was able to create a better customer experience for its clients and at the same time deliver a high level of security.