Creating a Minimum Viable Cloud
Working with VMDB, CTP built a fully automated
deployment of a Minimum Viable Cloud (MVC) on AWS,
addressing core infrastructure, security, operations and
automation components. The MVC passed multiple
third-party security audits, establishing the AWS
platform as a secure environment for future development.
The foundational services in the MVC addressed key
concerns common to all applications for highly regulated
enterprise clients, including security, compliance,
availability, manageability, transparency and scalability,
similar to the build-out of a new data center. The MVC
implementation goes beyond those typical data center
capabilities to include “Defense in Depth” level security,
extensive security and operations logging and monitoring,
full stack provisioning and configuration automation,
cost management, and creation of a new cloud-aligned
operating model designed to manage a highly dynamic
environment.
The new cloud platform is based on CTP, AWS and
industry best practices to maximize the agility, security
and manageability of the cloud deployment. In addition
to leveraging the extensive portfolio of AWS services,
CTP deployed technology from Trend Micro, HashiCorp,
Sumo Logic, Dome9, Chef, Jenkins, Atlassian, JFrog and
a variety of open-source tools.
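The provisioning and configuration automation described above typically includes guardrails that check every resource against baseline security and cost-management rules before it is created. A minimal sketch of such a check is below; the rule set, resource format and tag names are hypothetical illustrations, not CTP's actual implementation.

```python
# Minimal sketch of a provisioning guardrail: a resource configuration
# is validated against baseline rules (encryption enabled, required
# cost-allocation tags) before deployment. Rules and the resource
# format are hypothetical illustrations.

REQUIRED_TAGS = {"owner", "cost-center", "environment"}

def check_baseline(resource: dict) -> list:
    """Return a list of violations; an empty list means compliant."""
    violations = []
    if not resource.get("encrypted", False):
        violations.append("encryption-at-rest is disabled")
    missing = REQUIRED_TAGS - set(resource.get("tags", {}))
    if missing:
        violations.append(f"missing cost-allocation tags: {sorted(missing)}")
    return violations

# Example: an S3-bucket-like resource that is unencrypted and
# missing one required tag.
bucket = {"type": "s3_bucket", "encrypted": False,
          "tags": {"owner": "analytics", "environment": "dev"}}
print(check_baseline(bucket))
```

In practice a check like this would run inside the deployment pipeline (e.g. as a Jenkins stage), blocking non-compliant resources before they reach the account.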
Building Analytics Proofs of Concepts
Once the first set of approvals was secured, VMDB
engaged CTP to work jointly with its client analytics
team to create five separate PoCs related to their big
data environment. The bank set out to assess an array
of available analytics tools and models to demonstrate
the value of a data-driven banking platform. The
VMDB-CTP team delivered the PoCs, validating the
efficacy of the overarching analytics function and four
separate underlying use cases.
The most pressing PoC involved the ability to process
advanced text-based analytics. VMDB needed to
prove it could build machine-learning models, so it
developed a “topic model” based on data stored in
Amazon S3. CTP ingested the data, then used a
Latent Dirichlet Allocation (LDA) statistical model to
show how documents can be categorized into
different topics.
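The LDA step can be sketched as follows. The tiny corpus, the topic count and the tooling (scikit-learn here) are illustrative assumptions, not necessarily what CTP used on the bank's data.

```python
# Illustrative LDA topic modeling with scikit-learn; the corpus and
# n_components=2 are hypothetical stand-ins for the bank's documents.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "loan mortgage interest rate credit",
    "mortgage credit loan approval rate",
    "fraud alert suspicious transaction account",
    "suspicious account fraud transaction alert",
]

# Turn documents into a bag-of-words term matrix.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)

# Fit LDA; fit_transform returns each document's distribution
# over the latent topics (rows sum to 1).
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)

# The argmax of each row is the document's dominant topic.
dominant = doc_topics.argmax(axis=1)
print(dominant)
```

On a real corpus, the per-topic word distributions (`lda.components_`) are what an analyst would inspect to label the discovered topics.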
8 | THE DOPPLER | SPRING 2018
The second PoC examined the use of graph databases.
For instance, if the bank wants to mine data based on
fraud, it can build a graph database, generate graphs
of customer transactions and look for fraudulent
patterns. As customer transactions reach high
volumes, this kind of analysis is difficult to do in a
traditional database. CTP tested the system at one
million users, using a graph database management
system.
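The fraud-pattern idea can be sketched without a full graph database: model transactions as edges between customers and shared payment instruments, then flag instruments linked to unusually many distinct customers. The data and threshold below are hypothetical; at the bank's scale this would be a query against the graph database management system itself.

```python
from collections import defaultdict

# Edges: (customer, shared attribute such as a card or device).
# The transactions and the threshold are hypothetical illustrations.
transactions = [
    ("alice", "card_1"), ("bob", "card_1"), ("carol", "card_1"),
    ("dave", "card_2"), ("erin", "card_3"),
]

# Build the adjacency: attribute -> set of customers using it.
customers_by_attr = defaultdict(set)
for customer, attr in transactions:
    customers_by_attr[attr].add(customer)

# A payment instrument shared by many distinct customers is a
# classic fraud-ring signal; flag anything at or above a threshold.
THRESHOLD = 3
suspicious = {attr: sorted(users)
              for attr, users in customers_by_attr.items()
              if len(users) >= THRESHOLD}
print(suspicious)  # {'card_1': ['alice', 'bob', 'carol']}
```

A graph database makes this kind of multi-hop relationship query tractable at the million-user volumes mentioned above, where equivalent self-joins in a traditional relational database become prohibitively slow.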
In parallel, CTP tested scenarios comparing the
bank’s ability to migrate part of an existing data
warehouse onto the cloud vs. building it from scratch.
Shifting the data onto the cloud took four weeks – a
quarter of the time it would have taken to build in a
traditional environment.
Finally, CTP tested the bank’s ability to control the
flow of data and metadata. With all the regulatory
requirements the bank faces – including new
regulations