Verizon Global Technology Services (GTS) faced a multi-tier, labor-intensive process when migrating data from disparate sources into a data lake to produce financial reports and business insights.
View the webinar on-demand here: https://hortonworks.com/webinar/verizon-centralizes-data-into-data-lake/
TALK TRACK
Hortonworks is powering the future of data.
Whether from data at rest or data in motion, we help our customers tap into all the data.
We give the world’s leading companies and government agencies actionable intelligence to do things that were never before possible.
[NEXT SLIDE]
Hadoop Distribution with YARN: Provides a central source of data across all modes of ingestion and interaction
Existing & Legacy Systems can Contribute and Participate: Enriched data from the lake can extend the life of existing and legacy systems
New Applications interact with the Data Lake, not each other: Next-generation apps are built around the data and can deliver it to customers and partners
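To make the "apps share through the lake" pattern concrete, here is a minimal PySpark sketch. The paths and table names (hdfs:///landing/billing/, lake.billing_events) are hypothetical: one legacy feed lands in the lake once, and a downstream application reads the shared table instead of integrating point-to-point with the source system.

```python
from pyspark.sql import SparkSession

# Table and path names below are illustrative, not Verizon's actual schema.
spark = SparkSession.builder.appName("datalake-sharing-sketch").enableHiveSupport().getOrCreate()

# A legacy system's feed lands in the lake once ...
billing = spark.read.option("header", True).csv("hdfs:///landing/billing/")
billing.write.mode("append").saveAsTable("lake.billing_events")

# ... and a downstream application reads the shared table instead of
# calling the billing system directly.
recent = spark.sql("""
    SELECT customer_id, amount
    FROM lake.billing_events
    WHERE bill_date >= date_sub(current_date(), 30)
""")
recent.show()
```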
Sentiment from Social Media
How happy are customers with the carrier?
What is the churn risk if another member of a customer's social network leaves the carrier?
Customer Churn Score from Call Center Voice-2-Text & Voice Tone:
How happy are customers with the carrier?
What events have occurred that introduce a risk of customer churn?
CDR (Call Detail Records) with Social Media (joined in the sketch after these use cases)
Is poor QoS leaving the customer unhappy with the carrier?
Where should repairs and care service be prioritized to reduce the chance of churn?
CDR Location Data with Social Media
Where was the customer when the bad experience occurred?
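A minimal PySpark sketch of the CDR-plus-sentiment idea above, assuming hypothetical lake tables and columns (lake.cdr, lake.social_sentiment, dropped_call, sentiment_score); it simply flags customers whose poor network quality coincides with negative social sentiment.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("churn-signals-sketch").enableHiveSupport().getOrCreate()

# Hypothetical lake tables; names and columns are illustrative only.
cdr = spark.table("lake.cdr")                     # customer_id, cell_id, dropped_call, event_ts
sentiment = spark.table("lake.social_sentiment")  # customer_id, sentiment_score, post_ts

churn_signals = (
    cdr.groupBy("customer_id")
       .agg(F.avg(F.col("dropped_call").cast("double")).alias("drop_rate"))
       .join(
           sentiment.groupBy("customer_id").agg(F.avg("sentiment_score").alias("avg_sentiment")),
           on="customer_id",
       )
       # Toy risk flag: poor network quality combined with negative sentiment.
       .withColumn("churn_risk", (F.col("drop_rate") > 0.05) & (F.col("avg_sentiment") < 0))
)

churn_signals.write.mode("overwrite").saveAsTable("lake.churn_signals")
```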
Enrich the Data Lake with all available customer interaction points
Develop and Revise the Models that best capture your customer metrics:
Net Promoter Score, Customer Churn Score, Appetite for Information, Customer Target Profile, etc.
Deploy the Current State of the Customer, with reference data and real-time customer metrics,
computed (via ML) from the Data Lake into a Dynamic Customer Profile that is available across business groups and systems
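As a sketch of that "compute from the lake, publish a profile" step, here is a minimal Spark MLlib example. The table and column names (lake.churn_training, lake.customer_features, lake.dynamic_customer_profile, drop_rate, care_calls_90d, churned) are hypothetical stand-ins, not Verizon's actual schema or model.

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.functions import vector_to_array

spark = SparkSession.builder.appName("churn-model-sketch").enableHiveSupport().getOrCreate()

# Hypothetical training table from the enriched lake: engineered features plus a historical churn label.
train = spark.table("lake.churn_training")
assembler = VectorAssembler(
    inputCols=["drop_rate", "avg_sentiment", "care_calls_90d"],
    outputCol="features",
)
model = LogisticRegression(labelCol="churned", featuresCol="features").fit(assembler.transform(train))

# Score the current customer base and publish it as a shared "dynamic customer profile"
# table that any business group or system can query.
current = assembler.transform(spark.table("lake.customer_features"))
profile = (
    model.transform(current)
         .withColumn("churn_score", vector_to_array("probability")[1])
         .select("customer_id", "churn_score")
)
profile.write.mode("overwrite").saveAsTable("lake.dynamic_customer_profile")
```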
What the industry cares about:
Hadoop has moved out of test
Enterprise Use Case
Closer to Production
Business impact
Enterprise + Real-time vs. Sqoop + Batch
Attunity Replicate
High-performance connectivity to Hadoop through native APIs for data ingest and publication
Automated schema generation in HCatalog
Drag & drop configuration with Click-2-Replicate design
High-speed data load options (sketched in the example after this feature list):
Full reload with overwrite
Insert-only appends
Change Data Capture (CDC)
In-memory data filtering and transformation
Monitoring dashboard with web-based metrics, alerts and log file management
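A minimal PySpark sketch of the three load options listed above, using hypothetical staging and lake table names; it illustrates the behaviors (overwrite, append, land the delta) rather than Attunity Replicate's actual implementation.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("load-options-sketch").enableHiveSupport().getOrCreate()

# Hypothetical extract landed by the ingest tool.
extract = spark.table("staging.orders_extract")

# Full reload with overwrite: replace the target table entirely.
extract.write.mode("overwrite").saveAsTable("lake.orders")

# Insert-only appends: add new rows without touching existing ones.
extract.write.mode("append").saveAsTable("lake.orders")

# Change Data Capture: land only the delta (change records) for a later merge
# into the target; the merge pattern itself is sketched further below.
spark.table("staging.orders_changes").write.mode("append").saveAsTable("lake.orders_changes")
```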
One of the reasons several large technology companies trust and rely on Attunity for their own solutions is the robust CDC capability that Replicate provides.
Several options built into the product provide flexible, optimized ways to implement change data capture.
In addition to applying transactions in real time and in order, Replicate can handle varying volumes of changes on the source systems by applying the changes in optimized batches to improve throughput and latency.
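To illustrate the batching trade-off described above, the following Python snippet groups ordered change records into batches, capping both batch size and wait time. This is a generic sketch of the concept, not Attunity's actual internals; all names are illustrative.

```python
import time

def apply_batch(changes):
    # Placeholder: in practice this would bulk-load the batch into a staging
    # area and merge it into the target.
    print(f"applying {len(changes)} ordered changes")

def consume_changes(change_stream, max_batch=5000, max_wait_s=2.0):
    """Group ordered change records into batches: larger batches raise
    throughput, while the time cap keeps latency bounded."""
    batch, started = [], time.monotonic()
    for change in change_stream:                  # records arrive in source commit order
        batch.append(change)
        if len(batch) >= max_batch or time.monotonic() - started >= max_wait_s:
            apply_batch(batch)                    # order is preserved within each batch
            batch, started = [], time.monotonic()
    if batch:
        apply_batch(batch)

# Toy stream of change records.
consume_changes(({"op": "UPDATE", "id": i} for i in range(12000)), max_batch=5000)
```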
To provide high-speed data loads into data warehouse appliances, Replicate integrates with native data warehouse loaders for fast ingestion into the target, after which changes are merged in the target. It does not rely on suboptimal ODBC for loading data into the warehouse systems.
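The "bulk-load then merge in the target" pattern can be sketched as below, assuming hypothetical table names and a table format that supports SQL MERGE (e.g. Delta Lake or Apache Iceberg); the actual loaders and merge logic are handled by Replicate itself.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("merge-in-target-sketch").enableHiveSupport().getOrCreate()

# Bulk-load the captured changes into a staging table first (a native loader
# would do this far faster than row-by-row ODBC inserts) ...
spark.table("staging.orders_changes").write.mode("overwrite").saveAsTable("lake.orders_changes_stg")

# ... then merge the staged changes into the target in one set-based operation.
# MERGE requires a table format that supports it (e.g. Delta Lake or Apache Iceberg).
spark.sql("""
    MERGE INTO lake.orders AS t
    USING lake.orders_changes_stg AS s
    ON t.order_id = s.order_id
    WHEN MATCHED AND s.op = 'D' THEN DELETE
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```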
Attunity also recently added support for writing changes in a message-encoded format that can be published to Kafka message brokers.
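A downstream consumer of such change messages might look like the following kafka-python sketch; the topic name and JSON envelope shown are assumptions for illustration, not Attunity's documented message format.

```python
import json
from kafka import KafkaConsumer  # kafka-python

# Topic name and message envelope are assumptions for illustration only.
consumer = KafkaConsumer(
    "replicate.cdc.orders",
    bootstrap_servers="broker:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for msg in consumer:
    change = msg.value  # e.g. {"op": "UPDATE", "table": "orders", "data": {...}}
    print(change["op"], change["table"], change["data"])
```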