The New World of Database Technologies


Oracle: Migrating to Cloud Requires a 3-Pronged Strategy for Success
IBM: Moving a Relational Database From Ground to Cloud
IDERA: The Evolution of Enterprise Data Architecture
TmaxSoft: Database Dilemma: Best Practices for Available Database Options
Influx Data: Time Series: The Next Revolution of Database Technologies
Couchbase: Couchbase Data Platform for Exceptional Engagement in Retail



Best Practices Series

Nine Reasons to Be Optimistic About Today's Database Technology Choices

There's no question that today's data environments are complex entanglements that are growing uncontrollably. The days of laboring over database servers, writing scripts, and troubleshooting performance snags may be fading, as things move toward more automated approaches that are boosted by intelligent solutions and leverage an abundance of services from cloud providers and open source communities. At the same time, existing data infrastructures and solutions can be supported and migrated in an incremental way to the new world of data technologies, reducing the risks inherent in moving and retraining staff with new solutions. The key to all this is building a great deal of flexibility into approaches to data.

Here are the nine key areas where data environments are changing:

1. UNDERNEATH IT ALL, THINGS ARE GETTING VERY SERVICE-ORIENTED.

There are two ways data environments can be supported: via cloud or onsite hardware. The advantage with cloud is that data managers, developers, or testers who need a database environment can provision one almost instantaneously, without going through their respective IT, finance, or top managers or needing to acquire new servers or storage arrays to support their efforts. Cloud databases, also referred to as database as a service (DBaaS), can be SQL or NoSQL, open source or relational, and allow customers to choose the environment that is best suited for their particular use cases, whether that is support for streaming data pipelines or data warehousing for analytics. Flexible scaling and the ability to offload time-consuming database administration tasks to the cloud vendor are additional advantages. This means that what used to take weeks or even months to set up now can be done within a matter of minutes by spinning up cloud instances. As applications expand and grow, databases can expand and grow with them, as well as be downsized accordingly. And databases accessed via the cloud incorporate all the latest security, performance, and upgraded features. As a result, database teams can spend their time delivering capabilities to the business in terms of analytics and transaction support versus being mired in building and tuning databases.

2. ALSO UNDERNEATH IT ALL, THINGS ARE GETTING A LOT FASTER.

Rapid processing of big data is being enabled by frameworks such as Hadoop or Spark. In-memory technology has also become a ubiquitous part of the data world, now available through most major database vendors. In-memory can process workloads up to 100 times faster than disk-to-memory configurations. This, in turn, enables business at the speed of thought, as queries are delivered at blazing fast speeds. Along with faster processing, storage is also moving into new modes. There has been a rise in new types of storage, such as flash storage and solid-state disks. In addition, many organizations are moving to cloud-based storage now, as business and IT leaders grow more comfortable with, and even begin to depend on, the security that cloud vendors are able to provide. These new storage formats enable data to be maintained without the latency seen with disk drives.
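To ground the Spark point, here is a minimal PySpark sketch of in-memory processing, assuming a working Spark installation; the input file and column name are invented for the example, not anything the article specifies:

    # Minimal PySpark sketch: cache a dataset in memory, then aggregate it.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("in-memory-demo").getOrCreate()

    events = spark.read.json("events.json")  # hypothetical input file
    events.cache()                           # pin the DataFrame in cluster memory

    # Subsequent queries read from memory rather than rescanning disk.
    events.groupBy("event_type").count().show()

Once the data is cached, repeated aggregations avoid rereading from disk, which is the speedup the disk-versus-memory comparison alludes to.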
3. WE'RE BURYING MORE OF THE COMPLEXITY, AS THINGS GET MORE COMPLEX.

Let's face it: data environments have become increasingly complex, and big data isn't making the job any easier. As a result, many data managers are turning to a range of solutions, including automation, as well as appliances that essentially serve as data warehouses or security infrastructures in a box. Automation is also making its mark, with a large range of products that eliminate manual processes, particularly when it comes to troubleshooting issues or distributing workloads.

4. THERE HAVE NEVER BEEN SO MANY DATABASE CHOICES.

The selection of databases is vast, and choices depend upon needs seen within enterprises. Those enterprise teams seeking rapid development and fast access to unstructured data sources may find the NoSQL databases suited to their requirements. Even for structured, relational data, there are open source and cloud database offerings that promise much of the basic functionality formerly only seen within major relational database management systems.

5. DATABASE DEVELOPERS HAVE MANY MORE PLATFORM AND TOOL CHOICES AND HAVE A LOT LESS GRUNT-LEVEL CODING TO DO.

Database developers have a wide array of resources available from open source communities, cloud providers, and tool vendors. The open source communities have provided a number of robust, well-tested solutions, ranging from databases themselves to frameworks and development languages. Notably, there has been a rise of languages that directly support data analysis, such as R or Python, that are realigning data operations.
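As one hedged illustration of what those analysis-oriented languages provide, a few lines of Python with pandas can summarize a time-stamped log; the file and column names here are invented placeholders:

    import pandas as pd

    # Hypothetical request log: one row per request, with a timestamp column.
    df = pd.read_csv("requests.csv", parse_dates=["timestamp"])

    # Average response time per one-minute interval.
    per_minute = df.set_index("timestamp")["response_ms"].resample("1min").mean()
    print(per_minute.head())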
6. DATA IS GETTING MORE ACCESSIBLE, REGARDLESS OF FORMAT OR ORIGIN.

Enterprises are beginning to learn new ways to open up data to employees and customers. Database-as-a-service is an example of this trend. Many small to medium-sized companies, for example, don't have the staff, expertise, or desire to run and maintain data environments; yet, the ability to do so is critical in today's data-driven economy. Even units within larger organizations find it difficult to justify adding more capabilities to their data operations. In these cases, DBaaS is available through cloud sites, as well as provided by the major database vendors. Enterprise search is another form of deployment that may reshape the nature of database access.

7. FOR END USERS, DATA HAS BEEN COMING ALIVE, AND HAS NEVER BEEN SO BRILLIANT.

For years, static reports and dashboards sufficed. But as organizations seek to compete on analytics, there's a need for more sophisticated front-end interfaces that tell the story with more clarity. To meet these requirements, there are powerful data visualization tools now available at the front end. With the ability to shape and visualize data trends, business users can spot anomalies within seconds, as well as be able to map trends affecting their organizations and customers.

8. LATENCY IS SO 2010: REAL-TIME OPTIONS ARE INCREASING.

Real-time data flow or data streaming is becoming a part of corporate data strategies. As industries are disrupted by real-time, data-savvy startups, companies are increasingly embracing these capabilities themselves. The ability to speed time to market is one of the most critical forms of competitive advantage required in today's economy. Enterprises seek data environments that are blazing fast and can respond on a subsecond basis to events. It's a key competitive differentiator to be able to provide personalized customer experiences based on real-time information. Many legacy systems may rely on day-old data from ETL processes, but a new generation of tools and platforms, including in-memory technology, flash storage, and cloud, are easing the transition to real-time in a cost-effective way.

9. EVENTUALLY, DATA MANAGEMENT AND STORAGE MAY BE THE RESPONSIBILITY OF DATA USERS THEMSELVES.

The future of databases may not be in self-contained databases as we know them today, but perhaps in some vast, distributed uber-database that extends to all corners of cyberspace. That's the thinking behind blockchain technology, which shifts many database services and functions to an independently maintained global ledger that is seen as secure due to its redundancy and immutable nature. This represents a significant transition from the way data has been traditionally organized within enterprises. For instance, control over the data shifts from enterprise overseers, such as database administrators or IT executives, to the data owners or creators, administered by a global network with no owners. Ultimately, this provides enterprise users with greater control over the information that is employed in transactions and services. While blockchain is still in its formative stages, many enterprises are showing considerable interest in the approach, and vendors are likely to follow suit, building data services and support on top of blockchain implementations.
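As a hedged illustration of why such a ledger is considered immutable, a few lines of Python show the hash-chaining idea at its simplest; this is a toy sketch, not how any production blockchain is implemented:

    import hashlib, json, time

    def make_block(records, prev_hash):
        # Each block commits to its records and to the previous block's
        # hash, so altering history changes every later hash.
        body = json.dumps({"records": records, "prev": prev_hash,
                           "ts": time.time()}, sort_keys=True)
        return {"body": body, "hash": hashlib.sha256(body.encode()).hexdigest()}

    genesis = make_block(["ledger opened"], "0" * 64)
    block1 = make_block(["alice pays bob 5"], genesis["hash"])
    print(block1["hash"])

Because each block's hash covers the previous block's hash, tampering with an early record invalidates every block after it, which is the property the approach relies on.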
Joe McKendrick

Migrating to Cloud Requires a 3-Pronged Strategy for Success

In the early days, Cloud Computing was heavily leveraged for technology experiments, lower cost proof of concepts, early stage startups, transient-repeatable batch based workloads, and just generally for temporary activities. Today, as Cloud Computing has matured, migrating to the Cloud, which has a completely different tone of permanency, is an increasingly hot topic of discussion.

Lines of Business and Information Technology providers are migrating to Cloud, across small to medium sized businesses and at the Enterprise level. The desire for an increased velocity to innovate and execute is driving this movement. Further, Cloud provides more efficient, and at times, lower cost approaches to common activities such as setup and teardown of development and test environments, scale on demand disaster recovery sites, and an improved ability to pay for what is used rather than for peak demand forecasting. Surprisingly, in a reverse trend from the cautions of early adoption, Cloud is proving to reduce risk in areas of security and compliance, as the largest contributing factor to failures in these areas, the human factor, is normalized across thousands of businesses that leverage the same hardened processes and technologies.

Cloud value can be described at a basic level as automation and self-service of the Data Center. Cloud providers then add velocity boosters such as Platform Services to help accelerate new development and Software Services that provide immediate and material benefits for operating a modern business.

ORGANIZATIONS LOOK TO A HYBRID APPROACH FOR FLEXIBILITY AND REDUCED RISK

Migrating to Cloud is not just about technology; it is also about processes and people. Choosing a migration approach that facilitates leveraging the benefits of Cloud while understanding the impact on processes and the people behind them can be an important factor in successful Cloud migration. For this reason, many businesses are finding value in a Hybrid approach. A well-defined hybrid approach can provide access to Cloud automation and self-service while providing the flexibility to plan for workload migration in a way that reduces risk and maximizes value, because you can control that workload migration in lock step with a change in processes.

Oracle is taking a dual-pronged approach to delivering a hybrid Cloud capability for businesses planning a Cloud migration. First, it provides a basic technology stack management capability that can see and manage resources uniformly across both on-premises and cloud. Second, Oracle literally brings the Cloud to you, via Oracle Cloud Machines that deliver the same technology stack that runs Oracle's Public Cloud behind your own data center firewalls. Oracle Cloud Machines extend the reach of your on-premises systems to Cloud values at your site and through to the Oracle Cloud, giving you complete choice of how to govern the migration of workloads across Public Cloud, Private Cloud, and on-premises environments.

TEST CLOUD MIGRATION BENEFITS BY STARTING WITH THE RIGHT KIND OF WORKLOAD

The workloads that are moving to Cloud first are the ones that have an incremental and controlled impact on the people and processes that, in turn, impact the business critical side of operations. Development and test is by far the most prevalent activity migrating to Cloud, and the workloads associated with that activity vary from extremely lightweight functional testing to large scale systems tests. Given the variation in workload characteristics associated with development and test, it provides an opportunity to get value out of using Cloud in a number of important ways.

The lightweight workloads associated with functional testing provide a perfect opportunity to leverage consolidation of workloads, looking for highly cost effective shared infrastructure or containerized deployment targets. The trick here is to make sure that you still have effective control of data and process isolation with your consolidation approach. Technologies found in the Oracle Cloud like Oracle's multitenant can enable this kind of secure consolidation, allowing 100s of tenants per compute core instead of giving each functional tester a large unnecessary slice of compute that just wastes money. Further, Oracle Cloud integrates agile processes used for development and test, like test master database creation and lifecycle management, into Cloud automation while providing portability of data and security policies back and forth with on-premises deployments. This brings the benefits of consolidation while enabling a modern continuous integration and delivery pipeline for data driven applications through tight integrations with a modern developer stack service.

ORCHESTRATION FOR YOUR ENTIRE STACK IS KEY

On the other hand, the large scale system test aspect of development requires a Cloud automation capability that can set up and tear down complex architectural topologies quickly, scaling up and scaling out to the level required to support the eventual production deployment footprint. A key to the quick setup and teardown for predictable and repeatable testing is a full stack orchestration capability that can handle all layers of the application architecture. This calls for declarative orchestration that can describe a software stack and the required infrastructure (compute, network, and storage) needed to run that stack, and that can execute to automatically create all of

the resources and deploy all of the software on demand, run the series of tests, tear down the stack to free up the resources, and/or stop the billable costs. In this process, the most challenging aspect is dealing with the database layer, as the storage and access of data is at the heart of most stateful application stacks, and dealing with the database means dealing with data in transactional workloads where your availability and recoverability SLAs come into play.

Oracle Cloud provides orchestration capabilities for the entire stack, not only providing orchestration of infrastructure and any custom software required, but also orchestration of Oracle Database best practice, Oracle Maximum Availability Architecture. This includes the ability to orchestrate Real Application Clusters (RAC), Data Guard, or even advanced platinum SLA level architectures that combine RAC, Data Guard, and Golden Gate to provide proven business critical system stacks. As workloads move into production, high availability and disaster recovery characteristics of a maximum availability architecture become essential aspects of migrating to cloud while retaining the confidence to deliver business critical SLAs.

MINIMIZE DOWNTIME AND RISK WHEN YOU LIFT AND SHIFT

Departmental workloads are often a great target for a lift and shift migration to cloud. One can apply the orchestration techniques discussed above for application runtime lift and shift. The sticky part is migrating the existing data into the Cloud. The good news is that Oracle Database Cloud Services use the same Oracle Database engine as that found on-premises, so the same software, architecture, and tools used on-premises can help you with a Cloud migration. Of course, the techniques and tools used will depend on factors involving network bandwidth and storage IO characteristics, along with the specific data type and size and the requirements for business continuity for the application services.

If you are migrating non-critical applications that can take extended maintenance windows, basic data movement techniques such as file copy, export/import, transportable tablespaces (useful for heterogeneous data movement), and tools that expose those, like Data Pump, RMAN, and SQL Developer, can be applied.
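As one hedged illustration of the export/import route, a Data Pump transfer could be scripted along these lines; the connect strings, schema name, and directory object are placeholders, not values the article specifies:

    import subprocess

    # Export a schema from the on-premises database with Data Pump.
    subprocess.run(["expdp", "admin/secret@onprem",
                    "schemas=HR", "directory=DATA_PUMP_DIR",
                    "dumpfile=hr.dmp", "logfile=hr_exp.log"], check=True)

    # After transferring hr.dmp to the cloud instance's directory object,
    # import it into the cloud database.
    subprocess.run(["impdp", "admin/secret@clouddb",
                    "schemas=HR", "directory=DATA_PUMP_DIR",
                    "dumpfile=hr.dmp", "logfile=hr_imp.log"], check=True)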
However, if you are looking to minimize downtime in a lift and shift of critical applications, you can consider leveraging Oracle Cloud automation to set up a hybrid disaster recovery site from on-premises to cloud, then perform a switchover operation to bring the production workload to the Cloud. Oracle makes it very easy to use tools like RMAN with Cloud Service automation hooks to easily instantiate Cloud service instances from on-premises database backups and turn those into replicas for minimal downtime migrations.

Migration of modern data driven workloads that combine streaming, unstructured, structured, and NoSQL data into intelligent analytical workloads is actually less common, as these applications are typically born in the cloud. For these modern applications, migration most commonly deals with data movement. A combination of software appliances that integrate on-premises file systems to Cloud object storage, technologies for continuous real-time data transfer like Golden Gate, and brute force techniques like physical storage appliance migration services that ship to your data center, where data is loaded and then brought to the cloud provider, all come together to fulfill the spectrum of data migration needs.

Once these new types of data arrive in the Cloud and are put into the specific technologies handling the unique workloads for that data type, the data is not locked into isolation silos making it unusable for future business applications. Oracle Cloud makes the migration and use of these data types easy by providing an open platform of services from Big Data to NoSQL to traditional Relational, while integrating the technologies to work together in a modern data management platform service. Oracle Cloud eases migration by handling any data type, any workload, at any scale, then uniquely integrating these point solutions with singular standards-based access using SQL. Oracle will let your existing SQL tools have access to the data across the platform, in a single view of all the data, and allow you to process that data without having to move it. This is possible because Oracle extends its engineered systems architecture to provide the building blocks for a Cloud infrastructure that is intelligently powering an end to end data management experience.

GET READY TO CAPITALIZE ON THE CLOUD

Migrating to the Cloud can unlock tremendous value for the business. Planning a path that considers technology, processes, and people will provide the most effective return on your effort. Capitalizing on new forms of automation to optimize costs with evolving and dynamic processes is only the beginning. Unlocking the power of new modern techniques to leverage all kinds of data in any workload and at any scale will bring the competitive advantages necessary to succeed in the business's digital transformation to Cloud.

Robert Greene is a senior director, product management at Oracle.

Moving a Relational Database From Ground to Cloud

This article was authored by: Roger E. Sanders, Db2 Offering Manager, IBM
There is a bias among many database administrators (DBAs) that public clouds are neither capable of meeting the demands often placed on relational database management systems nor secure enough to house personally identifiable information (PII). Because of this, the use of a public cloud for database deployments is an option many DBAs are unwilling to explore. But, after a few years of using the cloud for other applications, companies are finding that the benefits of cloud computing (faster access to infrastructure, greater scalability, and higher availability top the list) are living up to expectations set by cloud providers. Consequently, cloud adoption is growing, and the use of a public cloud for relational database deployments can no longer be avoided.

WHY DBAs SHOULD BE CONSIDERING PUBLIC CLOUDS

In addition to reports produced by Gartner, IDC, and Forrester, one of the things many cloud experts watch for are the results of the State of the Cloud Survey that is conducted each year by RightScale, an enterprise cloud management company. In January 2017, just over one thousand technical professionals responded to this survey, and results indicate that cloud computing is being used across a variety of industries, by small and midsized businesses as well as large enterprises.

To segment and analyze organizations according to their levels of cloud adoption, RightScale uses its own Cloud Maturity Model, which consists of four distinct stages of cloud maturity:

• Cloud Watchers: organizations that are developing cloud strategies but have not yet deployed applications in the cloud.
• Cloud Beginners: organizations that are new to cloud computing and are working on proof-of-concepts or initial cloud projects.
• Cloud Explorers: organizations that have multiple projects or applications already deployed in the cloud.
• Cloud Focused: organizations that are heavily using cloud infrastructure and are looking to optimize cloud operations and minimize cloud costs.

According to the RightScale 2017 State of the Cloud Report, although it appears that smaller organizations are more likely to be Cloud Focused, an almost equal portion of enterprise respondents are in the two most mature stages of cloud deployment (as shown in the Cloud Maturity by Size chart below). This same report indicates that hybrid and public cloud adoption is growing, while the use of private clouds is going down.

Based on these findings, it's clear that the move to deploy enterprise-level relational databases in a public cloud is inevitable. But what advantages can DBAs expect to gain by this move? It turns out, there are several. For example, a DBA can utilize a public cloud to:

• Support new development and testing initiatives without having to manage multiple database environments. With a managed Database-as-a-Service (DBaaS) offering like IBM dashDB and IBM dashDB for Transactions, a DBA only has to focus on data object creation and data delivery; the service provider (in this case IBM) takes care of everything else.
• Support proof of concept (POC), prototype, and pilot projects. Utilizing a private cloud, DBAs can experiment with the latest database technologies in a cost-effective environment that doesn't require additional data center resources (i.e., floor space, power, networking, etc.).
• Mitigate risks for unplanned downtime while keeping infrastructure costs to a minimum. Using a hosted DBaaS offering such as IBM DB2 on Cloud, a DBA can quickly set up a disaster recovery (DR) environment without the need of a second, off-site data center.
• Consolidate smaller on-premises databases. With a hosted DBaaS offering, DBAs can consolidate sprawling, on-premises systems without having to change the management processes and tools they are familiar with; with a managed offering, they can let the cloud provider take over the maintenance and management of the databases that were consolidated so they can focus on other priorities.

PERFORMING A GROUND-TO-CLOUD MIGRATION

When people think of moving a database from one location to another, movement of data is usually the first thing that comes to mind. But a relational database consists of much more than just data. There are data objects like schemas, tables, indexes, and views that control how data is stored (and in some cases how it is organized). There can also be system objects like buffer pools, storage groups, table spaces, and transaction log files that provide, among other things, data integrity, optimal performance, and high availability.

Many, if not all, of these objects must be taken into consideration before a ground-to-cloud migration can occur. Objects available to an on-premises database may not be supported by the cloud DBaaS offering desired. This is particularly true for system objects that are tied directly to a database's underlying storage, or data objects such as user-defined functions and stored procedures that rely on external libraries to provide functionality.
Sponsored Content JUNE/JULY 2017 | DBTA 45

Cloud Maturity by Size (1,002 respondents)

             No Plans   Planning     First Project   Apps Running   Heavy Use
                        (Watchers)   (Beginners)     (Explorers)    (Focused)
SMB             7%         14%           20%             21%           38%
Enterprise      4%         14%           24%             30%           28%

Source: RightScale 2017 State of the Cloud Report

The process used to move a database from one location to another usually consists of the following steps:

1. Ensure both the source and the target databases are accessible to someone who has the authority to retrieve the source database's data.
2. Generate the Data Definition Language (DDL) statements needed to recreate objects found in the source database; with IBM DB2 for Linux, UNIX, and Windows (DB2 LUW), this can be done with the db2look utility.
3. Create the appropriate objects in the target database, using the DDL produced in the previous step.
4. Extract data from the source database and store it in a secure place, temporarily (with DB2 LUW, this can be done with the Export utility or IBM Optim High Performance Unload).
5. Copy data that was extracted from the source to the target (with DB2 LUW, this can be done with the Import or Load utility).
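A hedged sketch of steps 2, 4, and 5 using the DB2 LUW utilities named above might look like the following; SRCDB, TGTDB, and the accounts table are invented placeholders, and the CLP calls are chained in a single shell so each database connection persists:

    import subprocess

    # Step 2: capture DDL from the source database with db2look.
    subprocess.run("db2look -d SRCDB -e -o ddl.sql", shell=True, check=True)

    # Step 4: export table data from the source database.
    subprocess.run(
        'db2 "CONNECT TO SRCDB" && '
        'db2 "EXPORT TO accounts.ixf OF IXF SELECT * FROM accounts"',
        shell=True, check=True)

    # Step 5: import the extracted data into the target database.
    subprocess.run(
        'db2 "CONNECT TO TGTDB" && '
        'db2 "IMPORT FROM accounts.ixf OF IXF INSERT INTO accounts"',
        shell=True, check=True)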
However, because of subtle differences that often exist between on-premises databases and public cloud DBaaS offerings, the DDL generated from the source will most likely need to be altered before it can be used against the target. For example, DDL statements for system objects like buffer pools and table spaces may have to be altered (or removed). And DDL statements for data objects like triggers and materialized query tables (MQTs) may have to be separated and written to a second file so they can be created after the target cloud database has been populated.

PROTECTING SENSITIVE DATA DURING A GROUND-TO-CLOUD MIGRATION

Encryption, a process that transforms data into an unintelligible form so it cannot be obtained, or can only be obtained using a special decryption process, is the most effective way of protecting sensitive information that is transmitted through untrusted communication channels or stored on some form of electronic media (for example, a disk drive). More often than not, data is encrypted using an industry standards-compliant cryptographic algorithm like the Advanced Encryption Standard (AES) cipher, together with a 128- or 256-bit key. Once data is encrypted, the key, together with the algorithm, can be used to return it to its original form.

When sensitive data like PII is moved from an on-premises database to the cloud, Secure Sockets Layer (SSL) technology, sometimes referred to as Transport Layer Security (TLS) technology, should be used whenever possible. SSL is a standard security protocol that is used to establish an encrypted link between two communicating computer applications; it provides both data encryption and data integrity.

And, if sensitive data is to remain in the cloud, some form of encryption that is provided by the cloud database offering (not the underlying storage infrastructure, which can be vulnerable in some situations) should be used to keep the data secure at all times. (DB2 Native Encryption, which is implemented within the DB2 kernel itself, is used to perform this vital task with DB2 LUW, DB2 on Cloud, dashDB, and dashDB for Transactions.)

It is important to keep in mind that when data is extracted from an encrypted database, it is no longer encrypted. Because it is assumed that the individual extracting the data has the authority to do so, this does not present a problem. However, once a file containing un-encrypted data leaves the workstation it was created on, the data can no longer be considered secure. Therefore, if sensitive data is to be moved as part of a ground-to-cloud migration, it is imperative that the individual performing the migration is the only one allowed to access such data files. If these files are transferred to the cloud provider for any reason, the security of the data can no longer be guaranteed. And it goes without saying that files containing un-encrypted data should be securely deleted once a ground-to-cloud migration is complete.
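As a hedged, illustrative precaution, an extracted file can be encrypted on the workstation before it travels anywhere; this sketch uses the Python cryptography package's AES-based Fernet recipe with an invented file name, and is not a mechanism the article itself prescribes:

    from cryptography.fernet import Fernet

    # Encrypt an extracted data file before it leaves the workstation.
    # Key management (storing the key apart from the file) is up to you.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    with open("accounts.ixf", "rb") as f:
        protected = cipher.encrypt(f.read())

    with open("accounts.ixf.enc", "wb") as f:
        f.write(protected)

    # The same key, kept off the transfer channel, decrypts it later:
    # cipher.decrypt(protected)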
CONCLUSION

Although many DBAs are reluctant to use a public cloud for database deployments, research shows that the movement of some on-premises relational databases to the cloud is inevitable. By embracing the public cloud instead of avoiding it, DBAs can off-load the management and maintenance of select database systems to cloud providers so they can focus their attention on mission-critical databases that are at the core of their business. However, the migration of any on-premises database to a public cloud takes careful planning. And it is crucial that data integrity and security be maintained throughout the migration process.

IBM
www.ibm.com

The Evolution of Enterprise Data Architecture
Data is proliferating at an accelerated rate, with all the mobile and desktop apps, social media, online purchasing, and consumer loyalty programs available today. These data sources have not only changed the way we operate on a day-to-day basis, but also significantly increased the volume, velocity, and variety of data being created.

Faced with this growing trend, data professionals now have to look beyond the relational database to NoSQL database technologies to fully address their data management needs. The four categories of NoSQL databases are column-oriented, key-value, graph, and document-oriented databases, and each one is best suited to fill a specific data management niche.

Document-oriented databases, which store data in JavaScript Object Notation (JSON) documents, are commonly used to collect machine-generated data. This data is often generated rapidly and in high volume, and therefore is often difficult to manage and analyze with systems other than document-oriented databases.

MongoDB is one example of a document-oriented database and, although it excels at solving the problem outlined above, its schema-less architecture can pose problems related to data quality. For instance, in a single MongoDB collection, you could technically have different fields for accountNumber, AccountNumber, and accountnumber (note the different case usage). The data quality problem arises when a report is written that requires customer account numbers, but only queries the AccountNumber field and, therefore, only receives a subset of records. To steer clear of this problem, it is recommended to spend some time modeling your data before implementing a MongoDB system.
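A short, hedged PyMongo sketch makes that pitfall concrete, assuming a local MongoDB instance; the collection and values are invented for the example:

    from pymongo import MongoClient

    db = MongoClient()["demo"]  # assumes a MongoDB server on localhost

    # Nothing stops three spellings of the "same" field in one collection.
    db.accounts.insert_many([
        {"accountNumber": "A-100"},
        {"AccountNumber": "A-101"},
        {"accountnumber": "A-102"},
    ])

    # A report that queries only one spelling silently misses the rest.
    matches = list(db.accounts.find({"AccountNumber": {"$exists": True}}))
    print(len(matches))  # 1 of 3 documents

The report-style query sees one document out of three: exactly the silent subset of records described above.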
ER/Studio Data Architect makes modeling MongoDB easier with support for native reverse-engineering, plus a special modeling notation for the embedded object arrays used in MongoDB. This "is-contained-in" notation indicates the array relationship for nested objects in the model view. Additionally, Data Architect can forward-engineer a MongoDB model to blank-sample JSON, providing round-trip support. The MongoDB sources can be accessed securely through Kerberos and SSL authentication and encryption.

Other applications for big data often use Apache Hadoop as the underlying data management platform. Due to Hadoop's schema-less architecture and its ability to horizontally scale over commodity hardware, data storage is no longer the limiting factor for performing in-depth analytics. Hadoop gives data professionals the ability to keep up with the volume, velocity, and variety of data being created. However, its schema-less architecture does not mean that data modeling can be avoided.

Since data stored in Hadoop is almost always used for analytics, and because of its immense volume and wide variety, the need for data modeling has never been greater. As an example, if a micro-targeting strategy relies on personal data captured from a campaign website, then a dimensional data model should be used to dissect the data into facts such as website visits by potential supporters and dimensions such as location and time of their visits. This is necessary information for properly analyzing the data.

ER/Studio Data Architect provides support for Hadoop Hive with native reverse-engineering and DDL generation, and is compatible with many of the major third-party implementations such as Hortonworks. The reverse-engineering process generates datatypes and storage properties specific to Hadoop Hive.

Once all of the data sources are integrated into the modeling tool, the data architecture becomes the foundation for the enterprise's data strategy. To effectively leverage corporate data as an asset, business and IT teams must work together on the data strategy. While business leaders must drive the data culture in the organization, data architects serve as the champions for data value and data quality, ensuring that everyone understands what the data is and the value it brings to the business.

ER/Studio is IDERA's flagship data architecture solution that combines data, business process, and application modeling in a multi-platform environment. With ER/Studio, data modelers and architects can handle larger, more complex data models and manage numerous stakeholder groups that require access to information assets. Business stakeholders can participate in the data definition process, contribute to the glossary metadata and terms, and access information on models and metadata at the appropriate level for reporting and analysis. All of this can be accomplished in a flexible and agile workflow environment that responds to evolving business needs.

The ER/Studio suite expands visibility across the enterprise data landscape for improved alignment between business and IT. With support for numerous relational and big data platforms including Oracle, SQL Server, Sybase, Teradata, PostgreSQL, Hadoop Hive, and MongoDB, IDERA provides the most comprehensive data architecture solution. Discover the benefits of ER/Studio at www.idera.com.

IDERA
www.idera.com

Database Dilemma: Best Practices for Available Database Options
With all the available database options, it can be hard to determine which one is right for an enterprise. The three key issues most central to an organization's database needs are performance, security, and compliance; so what are best practices for each deployment option to manage those priorities?

ON-PREMISE/PRIVATE CLOUD

Examine expected ROI. Traditionally, the break-even/ROI for on-premise deployment is between 24 and 36 months; this is a guide and may not work for all organizations.

Where a customer is located relative to where the data is located can impact legacy applications. If users located in one part of the globe are accessing the database in another, there can be problems with applications like music streaming. Network constraints must be examined and mitigated. Security measures as well as disaster recovery must be architected into a solution as well.

A key consideration with this option is compliance issues for organizations in countries subject to strict data residency and sovereignty laws. Ensuring compliance in these regions is imperative, especially for industries such as financial services and healthcare.

HYBRID CLOUD

A hybrid cloud is flexible and customizable, allowing managers to pick and choose elements of either public or private cloud as needs arise. However, this can lead to sprawl, where the growth of underlying IT services is uncontrolled and exceeds the resources required. Therefore, it's important to have a governance strategy to manage sprawl. In addition, integrating an on-premise cloud into a public one is complicated and may lead to security issues, so data migration must be well-architected to ensure security and reduce complexity.

And, as noted above, if an organization is subject to local data sovereignty and security laws, it can mitigate some of their restraints with a hybrid cloud; some data can stay local and some can go into the cloud.

PUBLIC CLOUD

A public cloud is Opex friendly, but it can get expensive after the first 36 months. It is crucial to manage cost, so keep ROI in mind when deploying a workload via the public cloud.

More importantly, latency issues can occur depending on how the public cloud has been architected and how applications and infrastructure are deployed. In either case, the user experience can be greatly affected. To improve performance, distributing apps and data close to a user base is a better solution than the traditional approach where everything is in one data zone.

Security with a public cloud is always a challenge, but can be mitigated with proper measures such as at-rest encryption and well-thought-out access management tools or processes.

Sprawl can also plague the public cloud, and strategies to contain it must be embedded into an organization's cloud initiatives.

APPLIANCE BASED

An appliance database is utilized by an enterprise that has specific needs or when there is a tight coupling between applications, hardware, and the database.

Traditionally, this is an on-premise solution, either managed by a vendor or in an enterprise's own data center. If a vendor is chosen, be confident that it is the right fit both now and for the future. Appliance databases are often expensive, so ensure the implementation is cost-effective over time.

VIRTUALIZATION

With virtualization, patching can sometimes be an issue; each OS sits on top of a hypervisor, and IT may have to patch each VM separately in each piece of hardware.

It's best to plan for a higher initial Capex, because the cost of installing a database needs to be accounted for. An enterprise can opt for an open-source solution like KVM, but this solution often requires additional set-up expenses. In addition, it's a strategic imperative to have a fault-proof disaster recovery plan because the enterprise itself is the single point of failure; if hardware fails, VMs go down.

Also keep in mind that network traffic issues may arise due to multiple applications accessing the same network card. This is just one reason why an enterprise server must be purpose built for the virtualized environment.

Finally, examine older hardware and see if it can be repurposed and, if possible, consolidate multiple applications onto a single server in a virtualization environment.

Using these deployment option best practices as a starting point will help an organization determine the right database for its environment based on performance, security, and compliance needs.

Franco Rizzo is senior pre-sales architect at TmaxSoft. Franco can be reached at franco.rizzo@tmaxsoft.com.

Time Series: The Next Revolution of Database Technologies
The DB-Engines Ranking ranks database management systems according to their popularity. It is created and maintained by solid IT, and Time Series databases are tracked as their own category in the ranking.
WHAT IS TIME SERIES DATA
Time Series are simply measurements or events that are tracked, monitored, downsampled, and aggregated over time. This could be server metrics, application performance metrics, network data, sensor data, events, clicks, trades in a market, and many other types of analytics data. The key difference between time series data and regular data is that you're always asking questions about it over time. An often simple way to determine if the dataset you are working with is time series or not is to look and see if one of your axes is time.

Time Series data comes in two forms: regular and irregular. Regular time series consist of measurements gathered at regular intervals of time (every 10 seconds, for example). Many people refer to these as metrics. Irregular time series are event driven, either by users or other external events. Summarizations of irregular series become regular themselves: for example, summarizing the average response time for requests in an application over 1-minute intervals, or showing the average trade price of Apple stock every 10 minutes over the course of a day.

HORIZONTAL USE CASE

Time Series may seem like a niche and specific use case. How many organizations or developers actually have a need for a Time Series solution? Time Series database customers and users span three primary use cases: custom DevOps monitoring and metrics, real-time analytics, and IoT/sensor data monitoring. Anyone that has servers, VMs, applications, sensors, users, or events to track could benefit from using a Time Series database.

For users in the DevOps monitoring and metrics space, most are in medium to large organizations. Some of these users are building custom monitoring solutions from scratch, to track their servers, VMs, containers, and network hardware, and later as a generalized metrics platform for application developers within the organization. Others are using a Time Series database to supplement commercial APM products, to instrument portions of the stack for which no probes or agents exist, or to stitch together metrics from multiple monitoring solutions.

With real-time analytics, we see organizations of all sizes. Some are building applications that will face their users, with a Time Series database as the underlying database. Others are instrumenting business, social, or development metrics in real time for internal consumption. Users frequently start with the DevOps monitoring and metrics use case, then move into real-time analytics once the platform is deployed. The Time Series database eventually becomes the central store for all Time Series, sensor, and analytics data.

In the realm of IoT there are many sub-use cases. We've seen users in industrial settings like factories, oil and gas, agriculture, and smart roads and infrastructure. There are also users in consumer grade IoT like wearables, consumer devices, and trackers.

THE TIME SERIES WORKLOAD

Time Series data has a few properties that make it very different from other data workloads. Data lifecycle management, summarization, and large range scans of many records are what separate Time Series from other database use cases.

Time Series queries often request a summary of a larger period of time. This requires going over a range of data points to perform some computation, like a percentile, to get a summary of the underlying series to the user. This kind of workload is very difficult to optimize for a distributed key-value store. A Time Series database is optimized for exactly this use case, giving millisecond-level query times over months of data.

With Time Series it's common to keep high precision data around for a short period of time. This data is aggregated and downsampled into longer term trend data. Every data point that goes into the database will have to be deleted after its period of time is up. This kind of data lifecycle management is difficult for application developers to implement on top of regular databases. They must devise schemes for cheaply evicting large sets of data and constantly summarizing that data at scale.
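As a hedged sketch of that summarization workload, the 1.x-era InfluxDB Python client can ask the database to perform the rollup itself; the measurement name and connection details are invented for the example:

    from influxdb import InfluxDBClient  # 1.x-era Python client

    client = InfluxDBClient(host="localhost", port=8086, database="metrics")

    # Summarize months of raw points into one-hour averages; the
    # GROUP BY time() rollup is evaluated inside the database.
    result = client.query(
        "SELECT MEAN(value) FROM response_times "
        "WHERE time > now() - 90d GROUP BY time(1h)")
    print(list(result.get_points())[:3])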
InfluxData is designed as a Time Series database with solutions built in for summarization and data lifecycle management. These come out of the box, with no application-level code required from the developer. To learn more, take a look at our customer use cases.

INFLUX DATA
www.influxdata.com

Couchbase Data Platform for Exceptional Engagement in Retail
RETAIL IS BEING DISRUPTED BY DIGITAL TRANSFORMATION

The retail universe is in the midst of unparalleled digital transformation. Consumers can now buy anytime, anywhere, including on websites, mobile apps, devices, and even gaming consoles. Digital purchases are climbing, and according to research firm eMarketer, worldwide retail eCommerce revenues will increase from $1.915 trillion in 2016 to over $4 trillion in 2020.

During this transformation, the key challenges facing retailers are rising customer experience expectations, omnichannel delivery requirements, and support of unpredictable demands. Retailers must find a way to deliver exceptional customer experiences with fast, personalized, and context- and location-aware engagement. Simultaneously, they must manage exponentially growing volumes of users and data, while reducing cost and time to market.

LEGACY TRANSACTIONAL DATABASES CAN'T MEET THE CHALLENGE

It takes nearly flawless engagement throughout the customer lifecycle to create lasting relationships with today's demanding consumers, especially when their switching costs are so low.

Legacy transactional databases have rigid schemas, poor scalability, performance challenges, and high scale-up costs. They're simply not built to meet the demands of modern retail customers.

That's why hundreds of digital retail leaders, including Walmart, Tesco, Auchan, Ladbrokes, eBay, Fanatics, StubHub, Staples, and Cars.com, are embracing next-generation engagement databases to build and run their modern web, mobile, and IoT applications.

HOW GLOBAL RETAIL LEADERS DRIVE SUPERIOR INTERACTIONS

Top retailers use the Couchbase Data Platform to create extraordinary experiences that convert interactions into engagement that builds toward lifelong relationships.

Database challenge: Tesco is the U.K.'s largest retailer. As part of a major initiative to drive greater agility and data sharing across channels via a microservices architecture, they needed to provide a centralized product catalog service that was easy to maintain and update. The catalog had to store massive data, support a flexible schema for frequent changes to product data, and provide very low latency access to millions of documents.

Solution: Tesco uses Couchbase to store a variety of information, including SKUs, product and accounting hierarchies, and GTINs (barcodes and ISBNs). The data is ingested at great velocity from multiple data formats. Couchbase easily and inexpensively scales to support data for 10 million products at 35,000 requests per second.

Database challenge: eBay is a global eCommerce leader. eBay's costly transactional database lacked native sharding and robust replication, and inhibited the customer experience due to subpar performance. eBay needed an engagement database to support tens of billions of database calls/day with geo-distributed replication and sharding.

Solution: For operational dashboarding and session database functionality, eBay now runs Couchbase Server across 1,400 nodes and 2,000 servers. Couchbase serves as both a key-value store and a document database for multiple applications with over 80 billion database calls/day. With Couchbase, eBay also achieved active-active bidirectional replication with built-in cross datacenter replication (XDCR), a flexible document model for developer agility, and SQL integration with N1QL (Couchbase's SQL for JSON).

THE DATA PLATFORM FOR ENGAGEMENT

The Couchbase Data Platform is the Engagement Database built for change. It's lightning fast, secure, highly scalable, cloud-native, always on, and seamlessly mobile. Learn more at Couchbase.com

COUCHBASE UNIQUELY HELPS RETAILERS DRIVE EXCEPTIONAL ENGAGEMENT

Memory-centric architecture: The Couchbase Data Platform takes full advantage of all available memory to give applications the sub-millisecond responsiveness your shoppers expect.

Integrated cache: While other databases like MongoDB require a third-party cache, adding cost and complexity, the Couchbase Data Platform has a fully integrated cache that delivers blazing performance.

Powerful, SQL-based query language: Only Couchbase provides N1QL ("nickel"), a powerful query language that lets developers easily query JSON data using familiar, SQL-like expressions (see the sketch after this list).

Built-in high availability and disaster recovery: Couchbase comes with high availability within a cluster and market-leading XDCR capabilities to support disaster recovery and data locality requirements. You have full control over the topology: unidirectional, bidirectional, or any configuration.

Complete GUI-based admin console: Couchbase has a fully integrated GUI-based management console, complete with hundreds of prebuilt metrics and easy-to-use tools like push-button scaling, rebalancing, and memory tuning.

Always-on mobile support: Couchbase Mobile is a complete solution for mobile application support. It includes Couchbase Lite (an embedded database for devices) and Sync Gateway (a prebuilt solution that syncs the device with the cloud). You can easily support use cases such as personalized in-store apps, POS systems, and mobile-optimized digital catalogs.
