Help your Enterprise Implement Big Data with Control-M for Hadoop
Joe Goldberg
Solutions Marketing
October 2014
© Copyright 2015 BMC Software, Inc.
Who is using Hadoop and why?
Who is using Big Data?
Financial Services
• New Account Risk Screens
• Fraud Prevention
• Trading Risk
• Insurance Underwriting
• Accelerate Loan Processing
Retail
• 360° View of the Customer
• Analyze Brand Sentiment
• Personalized Promotions
• Website Optimization
• Optimal Store Layout
Telecom
• Call Detail Records (CDRs)
• Infrastructure Investment
• Next Product to Buy (NPTB)
• Real-time Bandwidth Allocation
• New Product Development
Healthcare
• Genomic data for medical trials
• Monitor patient vitals
• Reduce re-admittance rates
• Store medical research data
• Recruiting for pharmaceutical trials
Utilities, Oil & Gas
• Smart meter stream analysis
• Optimize lease bidding
• Compliance reporting
• Proactive equipment repair
• Seismic image processing
Public Sector
• Analyze public sentiment
• Protect critical networks
• Prevent fraud and waste
• Crowdsourced reporting
• Fulfill open records requests
Lots of Data: Traditional vs. Hadoop
The Players
Building a (Hadoop) Business Service
The Steps
• Identify Data Sources and Targets
• Write code
• Test
• Deploy
• Production
SQL Data Source
• SQL Server
– Write SQL script/Stored Procedure
– Learn SQL Agent Job Definition
– Write PowerShell script/batch file
– Define job
– Run job
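For context, the script this path typically produces is little more than a thin shell wrapper around sqlcmd. A minimal sketch follows; the server, database, credentials, and stored procedure names are placeholder assumptions, not details from the deck.

#!/usr/bin/env bash
# Hand-rolled wrapper of the kind the steps above imply (placeholder names throughout)
SQLSERVER="sqlprod01"
DATABASE="staging"

# Run the stored procedure; -b makes sqlcmd return a non-zero exit code on SQL errors
sqlcmd -S "$SQLSERVER" -d "$DATABASE" -U "$DB_USER" -P "$DB_PASSWORD" -b \
  -Q "EXEC dbo.usp_load_daily_extract" > load_daily_extract.log 2>&1
rc=$?

if [ $rc -ne 0 ]; then
  echo "Stored procedure failed with return code $rc" >&2
  exit $rc
fi
echo "Extract load completed"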
ETL Data Source
• Informatica
– Build Informatica workflows
– Learn PowerCenter Scheduler
– Write Scripts
– Build PowerCenter job
– Run job
Files Data Source
• Move files with FTP
– Learn FTP tool
– Write scripts
– Run FTP
Run Hadoop Jobs
• Oozie/Hue
– Write the MapReduce, Pig
– Learn Oozie
– Write scripts
– Build workflows
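The deck does not show the Oozie side itself, so here is a rough sketch of what "learn Oozie, write scripts, build workflows" tends to look like from the command line. The Oozie URL, cluster host names, HDFS paths, and workflow name are made-up examples.

# Submit and track an Oozie workflow by hand (all host names and paths are illustrative)
cat > job.properties <<'EOF'
nameNode=hdfs://namenode:8020
jobTracker=resourcemanager:8032
oozie.wf.application.path=${nameNode}/user/etl/workflows/daily_ingest
EOF

# Start the workflow and capture the job id that Oozie prints back
JOB_ID=$(oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run | awk -F': ' '{print $2}')

# Check its status; the workflow.xml with the MapReduce/Pig actions still
# has to be written separately and deployed to HDFS
oozie job -oozie http://oozie-host:11000/oozie -info "$JOB_ID"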
Programmers program
SQL Query
-- Sample SQL*Plus reporting script
set pagesize 0 linesize 80 feedback off
SELECT 'The database ' || instance_name ||
       ' has been running since ' || to_char(startup_time, 'HH24:MI MM/DD/YYYY')
FROM v$instance;
SELECT 'There are ' || count(status) ||
       ' data files with a status of ' || status
FROM dba_data_files
GROUP BY status
ORDER BY status;
SELECT 'The total storage used by the data files is ' ||
       sum(bytes)/1024/1024 || ' MB'
FROM dba_data_files;
Hadoop
#!/usr/bin/env bash
bin=`dirname "$0"`
bin=`cd "$bin"; pwd`
. "$bin"/../libexec/hadoop-config.sh
# set the hadoop command and the path to the hadoop jar
HADOOP_CMD="${HADOOP_PREFIX}/bin/hadoop --config $HADOOP_CONF_DIR"
# find the hadoop jar
HADOOP_JAR=''
# find under HADOOP_PREFIX (tarball install)
HADOOP_JAR=`find ${HADOOP_PREFIX} -name 'hadoop-*.jar' | head -n1`
# if it's not found, look under /usr/share/hadoop (rpm/deb installs)
if [ "$HADOOP_JAR" == '' ]; then
  HADOOP_JAR=`find /usr/share/hadoop -name 'hadoop-*.jar' | head -n1`
fi
# if it is still empty then don't run the tests
if [ "$HADOOP_JAR" == '' ]; then
  echo "Did not find hadoop-*.jar under '${HADOOP_PREFIX}' or '/usr/share/hadoop'"
  exit 1
fi
# dir where to store the data on hdfs; the path is relative to the user's home dir on hdfs
PARENT_DIR="validate_deploy_`date +%s`"
TERA_GEN_OUTPUT_DIR="${PARENT_DIR}/tera_gen_data"
TERA_SORT_OUTPUT_DIR="${PARENT_DIR}/tera_sort_data"
File Transfer
#!/bin/ksh
cd /home/bmcU1ser/ftp_race_source
# abort if the file has already been delivered to the target directory
if [[ -f /home/bmcU1ser/ftp_race_target/daily_shipment_log ]]; then
  exit 1
fi
sftp -b /dev/stdin -o Cipher=blowfish -o Compression=yes -o BatchMode=yes \
  -o IdentityFile=/export/home/user/.ssh/id_rsa -o Port=22 \
  bmcU1ser@hou-hadoop-mstr 1>sftp.log 2>&1 <<ENDSFTP
put daily_shipment_log /home/bmcU1ser/ftp_race_target
quit
ENDSFTP
rc=$?
if [[ $rc != 0 ]]; then
  print "*** Error occurred...$rc" `date "+%Y-%m-%d-%H.%M.%S"`
  if [[ -f /home/bmcU1ser/ftp_race_target/daily_shipment_log ]]; then
    rm /home/bmcU1ser/ftp_race_target/daily_shipment_log
  fi
else
  mv /home/bmcU1ser/ftp_race_source/daily_shipment_log \
     /home/bmcU1ser/ftp_race_source/old/daily_shipment_log
  print "*** Successful transfer...$rc" `date "+%Y-%m-%d-%H.%M.%S"`
fi
Informatica
#!/usr/bin/bash
# Sample pmcmd script
# Check if the service is alive
pmcmd pingservice -sv testService -d testDomain
if [ "$?" != 0 ]; then
  # handle error
  echo "Could not ping service"
  exit 1
fi
# Get service properties
pmcmd getserviceproperties -sv testService -d testDomain
if [ "$?" != 0 ]; then
  # handle error
  echo "Could not get service properties"
  exit 1
fi
# Get task details for session task "s_testSessionTask" of workflow
# "wf_test_workflow" in folder "testFolder"
pmcmd gettaskdetails -sv testService -d testDomain -u Administrator -p adminPass \
  -folder testFolder -workflow wf_test_workflow s_testSessionTask
if [ "$?" != 0 ]; then
  # handle error
  echo "Could not get details for task s_testSessionTask"
  exit 1
fi
What happens when this runs?
• What is related to what?
• Are we on time or late?
• What if something fails?
– Which program was running?
– Where is the output?
– How do I fix it?
– Can I just rerun it? If so, from the beginning?
– Does any cleanup have to be done?
– How do I track this problem and the steps taken to resolve the problem?
A Better Way
SQL Query, Hadoop, File Transfer, Informatica
Defining Control-M for Hadoop jobs
Set Script parameters
Hadoop Program parameters
HDFS commands:
• get
• put
• rm
• move
• rename
Supports all Apache Hadoop distributions (0.x-2.x):
• Cloudera
• Hortonworks
• MapR
• Pivotal
• BigInsights
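For comparison, the pre/post HDFS actions listed above correspond to the HDFS shell commands a hand-written job script would otherwise have to issue itself; the paths below are examples only.

# The same file manipulations done by hand with the HDFS shell (example paths only)
hdfs dfs -put /staging/daily_shipment_log /user/etl/input/            # put
hdfs dfs -get /user/etl/output/part-r-00000 /staging/results/         # get
hdfs dfs -mv  /user/etl/input/daily_shipment_log /user/etl/archive/   # move / rename
hdfs dfs -rm  /user/etl/tmp/daily_shipment_log                        # rm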
Building a Hadoop Business Process
HDFS
Java MapReduce
Pig
Hive
Sqoop
File Transfer
Informatica
DataStage
Business Objects
Cognos
Oracle
Sybase
SQL Server
SSIS
PostgreSQL
z/OS
Linux/Unix/Windows
Amazon EC2 / VMware
NetBackup / TSM
SAP / OEBS / Peoplesoft
Connection Profile
Monitoring Workflows
Resource Manager report
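The Resource Manager report surfaces job-level detail that could otherwise only be assembled by hand from the YARN command line; a minimal sketch, using a fictitious application id:

# Listing running applications and checking one job's status manually
yarn application -list -appStates RUNNING
yarn application -status application_1415000000000_0042    # application id is fictitious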
Workload Conversion & Discovery
BMC Control-M Workload Automation
Hadoop Application Developers: Write programs, Build Hadoop jobs, Add Pre/Post Jobs
Access for the Business
IT Scheduler
Job types: Pig, Hive, MapReduce, Sqoop, HDFS File Watcher
And the fun is just beginning…
Why BMC Control-M for Hadoop?
Partner
Key Takeaways
Big Data and Hadoop are coming to your Data Center
Build Applications Faster
• Eliminate scripting with built-in capabilities
• Reduce the complexity of building and testing applications
Increase Service Quality
• Production applications run more reliably, are easier to monitor, and ensure compliance
Gain Business Agility
• Easily build composite applications leveraging the full power of your technology fabric
For Additional Information: www.bmc.com/hadoop
Thank You.
Be sure to visit Control-M Labs
Editor's Notes
  1. Every industry segment is already using Big Data. If yours is not in the above list, it’s only because there’s not enough room.
  2. This is a single-slide explanation and justification for Hadoop: companies have lots of data, both conventional structured data and new types. But the fundamental problem is that there is more data than can be economically processed with traditional approaches. The traditional approach takes the ENTIRE data set, loads/copies it into a relational database or some other file structure, and processes it from top to bottom sequentially. This takes a long time and the hardware is VERY expensive. Even if money were no object, and when is THAT ever the case, getting bigger and bigger hardware would still give relatively incremental increases in capacity and speed. Hadoop, on the other hand, uses a cluster of cheap (NOT inexpensive but cheap) hardware that is expected to fail and so is disposable. (The cluster technology provides full redundancy so that when (not if) a component fails, the switch to an alternate copy of the data being processed is completely transparent.) The data is broken up into as many pieces as you have servers in the cluster (the largest Hadoop clusters have tens of thousands of nodes) and is processed in parallel. It is common for processing time to drop from hours or even days to minutes.
  3. So let’s talk about a project to deliver the first business service to take advantage of Big Data. You’ll see in a moment why (Hadoop) is in brackets.
  4. A business requirement/goal is established and Application Developers start designing. One of the first steps/requirements that we have heard from almost every single customer we’ve talked to is identifying the data that will be processed. Almost without exception, that data includes a whole bunch of traditional data sources like relational databases, data processed by ETL tools, data transferred from various sources and perhaps some new data types like social, web click or sensor data. Then all the other phases of Application Development are performed. This sounds very similar to other projects and in fact, much of what I’m about to discuss applies to just about every application development project.
  5. So this sounds easy, right? Just like the stuff you do all the time. Perhaps not. The pressure to deliver Big Data applications is huge. Many identify Big Data initiatives as key competitive differentiators, and time is not your friend. Hadoop and Big Data are relatively new technologies, so there are few experienced, seasoned practitioners. According to Gartner, in the next few years the market will provide only 25% of the required staffing to fill Big Data positions. These factors conspire to make Big Data/Hadoop projects particularly challenging to staff and deliver, and the pitfalls and delays that are all too common among traditional projects make Big Data projects all the more difficult for organizations to pull off successfully. This "piece of cake" can make you very sick.
  6. Let's examine some of the factors that are among the more challenging problems, especially because if you go down the common path, it becomes really difficult to change later. We've identified data sources. Let's quickly look at how each one is handled within the context of our first Big Data project.
  7. So somebody in AppDev says "I know how to fix this" and scripts a bunch of this stuff, but not all of these tools can be eliminated, so you get a bunch of scripts AND a bunch of tools. Testing is done and the application is delivered to Operations to "run this stuff."
  8. So somebody in AppDev says "I know how to fix this" and scripts all this stuff. Testing is done and the application is delivered to Operations to "run this stuff."
  9. There are five Control-M job types: Java MapReduce, Pig script, Hive, Sqoop, and HDFS File Watcher. All five are sub-types of the Hadoop job type. Select "Hadoop" from the Job Palette, then select the execution type. You can add program and environment parameters for this specific execution. It's common to have to manipulate files before and after program execution, so the Pre/Post Commands enable you to perform HDFS operations via the "Pre Commands" and "Post Commands" sections in the job definition. You can also choose whether the success of these Pre/Post actions will affect overall job status or not.
  10. Once you have built your Hadoop jobs, building a flow, whether for just Hadoop jobs or for connecting those Hadoop jobs into an enterprise business process that may include ETL, RDBMS extracts, File Transfers and any other job types/applications that Control-M supports, is a simple drag-and-drop process. And when the workflow is finished, if you need to add an SLA, add a backup at the end, or start up a VM that may not always be powered up, that too is a simple drag-and-drop addition of a BIM, Backup, or VMware job.
  11. The Connection Profile simplifies setup by collecting all environment info into a single object that is encrypted and managed by Control-M.
  12. Monitoring Hadoop is just like any other Control-M application with the ability to view job output, perform operational actions like Kill and provide visibility via Self Service.
  13. Control-M now provides huge value and great capabilities through the entire lifecycle of Hadoop applications. Developers can build Control-M jobs with Workload Change Manager (the simple, web-based self-service job authoring component which you will hear about very shortly) and submit requests to Production Control, which enables that group to service the request quickly and get it into production; once in operation, Control-M Self Service provides access to all constituents in the business.
  14. For Big Data, there seems to be a new project almost every day that will require integration. The challenges I've been discussing are destined to be encountered over and over again for these and other technologies.