Analytics and Lakehouse for Oracle Applications… Integration Options Explained
Red Hot
Ray Fevrier
Analytics & Lakehouse Cloud Design Specialist
March 3rd, 2023
Contributors
• Morgan Russell
• Wilbert Poeliejoe
• Anis Zerelli
• Carmine Acanfora
• Alina Stuparu
The following is intended to outline our general product direction. It is intended for
information purposes only, and may not be incorporated into any contract. It is not a
commitment to deliver any material, code, or functionality, and should not be relied upon in
making purchasing decisions. The development, release, timing, and pricing of any features or
functionality described for Oracle's products may change and remains at the sole discretion of
Oracle Corporation.
Safe Harbor Statement
Copyright © 2023, Oracle and/or its affiliates | Confidential: Internal
Agenda
Modern Data Platform
Fusion and EPM Data Extraction
Best Practices
Dipping our toes into Data Mesh
Specialists Assistance
Modern Data Platform
The modern data platform is an enabler for successful data-driven organizations.
Support Modern Data Platforms in OCI
with Lakehouse as the foundation
Red Hot
Jose Cruz
Analytics & Lakehouse Cloud Design Specialist
Leader
January 18th, 2022
OTube Link: https://otube.oracle.com/media/Red+Hot+-+Support+Modern+Data+Platforms+in+OCI+with+Lakehouse+as+the+foundation/1_sk0c3ty3
Data Ecosystem & Conceptual Architecture
Diagram: data, decisions, people, and processes form a cycle; decisions generate data, and data influences decisions.
The enabler to support a distributed data economy
The data ecosystem is the overall enabler for the data economy, and it is key to finding and using hidden data capital.
The modern data platform is the technology enabler and provides capabilities to address several architectural styles used to find, curate, and use data capital.
Data Lake
 Agile: new data sets may be quickly onboarded
 Any data, any history, cost-effective storage
 “Schema on read” for raw data; quick to add new data sets
 Scalable storage, processing, and access
 Data repository for analytics, ML, data labs, etc.
Data Warehouse
 Consistent, integrated data with data model and constraints
 All analytical data in a single place
 Strong emphasis on data quality and consistency
 “Schema on write” to conform data from sources
 Single Version of (analytical) Truth
Data Lakehouse
 Best of both worlds
 Combines benefits of Data Lake and Data Warehouse:
 Agility, scalability, and costs of Data Lake
 Data governance and SQL access of Data Warehouse
 Offers close integration between Data Warehouse and Data Lake
Building a Modern Data Platform with Data Lakehouse
Lakehouse Value Proposition
Oracle Lakehouse Reference Architecture
The data refinery is responsible for ingestion, orchestration, and part of the transformation of data into the persistence platform.
The persistence platform stores data in multiple layers of storage, from economical scale-out object storage to a formalized data warehouse and/or transactional systems. Data enrichment occurs, data quality is scrutinized, and the data definition is formalized as data moves between the storage layers.
The access and interpretation platform includes visualization capabilities for analytics of both static and streaming data. In addition, the results of executed machine learning models are curated and exposed for analysis.
Cloud data lake house - process enterprise and streaming data for analysis and machine learning
Overall concepts
Cloud data lake house - process enterprise and streaming data for analysis and machine learning
Gold (curated information)
This is where the past, current, and future interpretation of enterprise information resides. In this layer the data is structured to support agile access and navigation.
Silver (curated data)
Data in this layer has usually been enriched and/or has passed stringent data quality measures. The data is in an immutable, modelled form and is business-process neutral.
Bronze (raw data)
Data is loaded unchanged from sources. Enrichment of the data would promote it to the curated data layers. Data in this layer can be represented logically in the curated layers or queried for data science purposes.
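The bronze → silver → gold promotion described above can be sketched as a simple object-storage naming convention. The prefixes and path layout below are illustrative assumptions, not an Oracle-prescribed layout:

```python
# Illustrative medallion-layer naming for a lakehouse on object storage.
# The bucket/prefix layout is a hypothetical example.
LAYERS = ("bronze", "silver", "gold")

def object_path(layer: str, domain: str, dataset: str, filename: str) -> str:
    """Build an object path like 'bronze/finance/gl_balances/part-001.parquet'."""
    if layer not in LAYERS:
        raise ValueError(f"unknown layer {layer!r}; expected one of {LAYERS}")
    return f"{layer}/{domain}/{dataset}/{filename}"

def promote(path: str) -> str:
    """Return the path of the same object one layer up (bronze -> silver -> gold)."""
    layer, rest = path.split("/", 1)
    idx = LAYERS.index(layer)
    if idx == len(LAYERS) - 1:
        raise ValueError("gold is the final layer")
    return f"{LAYERS[idx + 1]}/{rest}"
```

Keeping the layer as the leading prefix makes it easy to apply per-layer security and governance policies at the prefix level.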
Oracle Lakehouse Reference Architecture
Data Persistency and Processing Layer
Cloud data lake house - process enterprise and streaming data for analysis and machine learning
Diagram components: real-time, streaming, and batch ingest; bulk transfer; cloud storage; batch and streaming processing; Hadoop ecosystem; serving data store; APIs; streaming analytics; analytics & visualization; AI services; machine learning; governance.
Capabilities and Key Characteristics
 Inclusive of all data types
 Agile in using all data
 Wide range of processing
engines
 Flexible and modular
 Open and interoperable
 Faster innovation
 Highly available and resilient
 Scalable and elastic
 Secure and compliant by design
 Governed and trustable
 Cost efficient
 Adaptable to current needs
 Future proof
 Self-serve to deploy and consume
 Encourages data democratization
 Embedded intelligence
Oracle Lakehouse Reference Architecture
Translytical Modern Data Platform
 What happened with my business this quarter?
 How does that compare with last year's quarter?
 What is the trend of my sales?
 How can I optimize my costs and operations?
 How can I leverage information to drive business?
 How can I be prescriptive?
 How can I infuse intelligence into my processes?
 What if I change my strategy?
Cloud data lake house - process enterprise and streaming data for analysis and machine learning
Oracle Lakehouse Reference Architecture
 Can be deployed across public, hybrid, and multicloud environments
 Interoperable with systems deployed on-prem or in 3rd-party clouds
 Seamless integration with services in Oracle Cloud, including SaaS
 Leverages OCI features for running and operating the workloads
 Can be extended with OCI Marketplace
 Can be extended with IaaS services to address highly heterogeneous workloads
 Can be part of a larger customer workload
 Implicitly leverages all OCI capabilities
OCI stack (diagram): Compute | Storage | Networking | Oracle Databases | Open Source Databases | Operating Systems, Native VMware | Developer Services | Containers and Functions | Application Integration | Data Lakehouse | Machine Learning and AI | Analytics and BI
Applications: Oracle Applications | Custom Applications | ISV Applications
Global Cloud Datacenter Infrastructure: Public Cloud Regions | Hybrid Cloud: Cloud@Customer, Dedicated Regions, Roving Edge | Multicloud: Azure, AWS
Cross-cutting: Security | Observability | Compliance | Messaging | Governance
Modern Data Platform
Part of a wider cloud infrastructure and platform
Fusion and EPM Data Extraction
Fusion and EPM (ERPM) Logical Architecture
A Few Key Points…
 Interaction with EPM Cloud is done via REST
APIs.
 EPM Automate is a REST API wrapper that
allows users to interact with their EPM pods;
however, it needs to be installed on-prem or in
a VM.
 Preferred method to extract data from Fusion is
through Business Intelligence Cloud Connector
(BICC).
 Oracle Analytics Cloud (OAC) is the only BI tool
able to connect directly to both ERP and EPM.
 Data Persistency is highly recommended to
reduce the dependency on the SaaS system,
especially when doing historical analysis. Also
helps with API throttling.
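Since interaction with EPM Cloud is via REST APIs, a minimal call might look like the sketch below. The pod URL, application name, and credentials are placeholders; EPM Cloud also supports OAuth 2, but HTTP Basic authentication is shown here for brevity:

```python
import base64

# Hypothetical EPM pod URL -- replace with your environment's value.
EPM_BASE = "https://epm-example.oraclecloud.com"

def planning_jobs_url(base: str, app: str, version: str = "v3") -> str:
    """Build the Planning REST endpoint for listing job definitions."""
    return f"{base}/HyperionPlanning/rest/{version}/applications/{app}/jobdefinitions"

def basic_auth_header(user: str, password: str) -> dict:
    """Construct an HTTP Basic Authorization header."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

# Actual call (requires network access to a real pod), e.g. with 'requests':
#   import requests
#   resp = requests.get(planning_jobs_url(EPM_BASE, "Vision"),
#                       headers=basic_auth_header("jdoe", "secret"))
#   resp.raise_for_status()
#   print(resp.json())
```

The same URL pattern (service context, REST version, application) applies to the other EPM job and data-management endpoints.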
Oracle Analytics Cloud (OAC)
The Oracle Analytics platform is a cloud native service that
provides the capabilities required to address the entire analytics
process including data ingestion and modeling, data preparation
and enrichment, and visualization and collaboration, without
compromising security and governance.
 Embedded machine learning (ML) and natural language
processing (NLP) technologies help increase productivity and
build an analytics-driven culture in organizations
 Oracle Analytics supports a hybrid deployment strategy (start
on-premises or in the cloud)
 Connectors to dozens of data sources including: Amazon
Redshift, Fusion Apps, EPM Cloud, Microsoft SQL Server,
Snowflake, Autonomous Oracle Database etc.
 Data Flows and Pipelines help move and transform data across analytic pipeline stages, built into OAC
 Data Replicator, a data replication tool that is more user-friendly than most data integration tools, allows users to copy data from Fusion to ADW
First choice for OAC centered use cases
Autonomous Data Warehouse (ADW)
First choice for ADB centered use cases
Autonomous Database (ADB) is a fully automated database service that makes it
easy for organizations to develop and deploy application workloads regardless of
complexity, scale, or criticality. ADB’s Autonomous Data Warehouse (ADW)
supports analytics and warehouse workloads.
 Data Transform (powered by Oracle Data Integrator) provides a powerful ELT engine that is heavily optimized for Oracle SQL and database utilities
 Analytic Views provide a single version of the truth by creating a single definition of structure and calculation rules (aggregations, measures) in the database
 Define relationships between dimension and fact tables (star schema)
 Define the business model (semantic layer) and calculation expressions
 Primary use cases include: visualization-agnostic semantics, enhanced data sets for OAC, and application development using APEX
 Process data from multiple sources/clouds
 Autonomous Databases (ADW, ATP, …)
 Cloud storage: OCI Object Storage, AWS Simple Storage Service (S3), Azure Data Lake Storage (ADLS)
 ADW processes flat files (CSV, Parquet, JSON, Avro, ORC)
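As a concrete example of self-service ingestion, ADW can load a CSV from object storage with `DBMS_CLOUD.COPY_DATA`. The sketch below drives it from python-oracledb; the connection details, credential name, table name, and bucket URI are all hypothetical placeholders:

```python
# Sketch: load a CSV from OCI Object Storage into an ADW table via DBMS_CLOUD.
# All identifiers here are placeholders for your environment.
COPY_DATA_PLSQL = """
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => :table_name,
    credential_name => :credential_name,
    file_uri_list   => :file_uri_list,
    format          => json_object('type' value 'csv', 'skipheaders' value '1')
  );
END;"""

def load_csv_into_adw(conn, table_name, credential_name, file_uri_list):
    """Run DBMS_CLOUD.COPY_DATA on an open python-oracledb connection."""
    with conn.cursor() as cur:
        cur.execute(COPY_DATA_PLSQL,
                    table_name=table_name,
                    credential_name=credential_name,
                    file_uri_list=file_uri_list)

# Usage (requires python-oracledb and a reachable ADW instance):
#   import oracledb
#   conn = oracledb.connect(user="admin", password="...", dsn="myadw_low")
#   load_csv_into_adw(conn, "GL_BALANCES", "OBJ_STORE_CRED",
#                     "https://objectstorage.us-ashburn-1.oraclecloud.com/n/ns/b/bucket/o/gl.csv")
```

The target table must exist and the credential must have been created beforehand with `DBMS_CLOUD.CREATE_CREDENTIAL`.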
Oracle Integration Cloud (OIC)
A visual application integration and automation tool that
provides prebuilt connectivity to SaaS and on-premises
applications
Benefits
 Run-ready process automation templates with drag-and-drop designers
 A low-code visual builder for web and mobile application development
 Business insight across end-to-end digital processes
 B2B capabilities for EDI and secure file transfer
 Machine learning recommendations for easy data
mapping
Key Highlights
 Mainly used as an orchestration tool or for application to
application integration
 Files managed by reference up to a maximum size of
1GB
 Ideal when interacting with applications using REST API
Oracle Data Integrator (ODI)
Diagram: next-generation “E-L-T” architecture. Extract from most apps, databases, and cloud sources (Teradata, SAP, Siebel, E-Business Suite, IBM DB2, Netezza, …); bulk data movement via BICC, Oracle Data Pump, Oracle DB links, JMS, and external tables into cloud databases (ADW) and big data stores; transform in place using Spark, Hive, HBase, Pig, Sqoop, and SCD handling.
Oracle Data Integrator (ODI) provides data migration with its innovative extract-load-transform (E-L-T) technology, which is optimized for most on-premises and cloud databases.
Original E-LT Innovator
 E-L-T approach allows users to perform transformations at either
source or target
 E-L-T provides a flexible architecture for optimized performance on
any platform
 ODI provides high performance bulk data movement, massively
parallel data transformation using database or big data
technologies, and block-level data loading that leverages native
data utilities
Benefits
 Prebuilt connectors for many databases and technologies
 Fewer network hops which improves performance for loading
 Takes advantage of existing infrastructure
 Pluggable Knowledge Module Architecture
 Available on-prem and OCI MarketPlace
OCI Data Integration (OCI DI)
First choice for OCI platform-wide use cases
What is it?
OCI DI is a fully-managed, cloud native ETL tool that simplifies complex data extract,
transform, and load processes (ETL/E-LT) into data lakes and warehouses for data
science and analytics with a modern, no-code dataflow designer.
Key Features
 Intuitive, no-code graphical user interface that makes it easier to design and manage data flows
 Easy-to-use graphical designer that automatically generates execution code and provides a visualization of the dataflow prior to load
 Pay As You Go pricing which makes OCI DI cost effective (no need to invest in
new servers and software)
 Reusable templates and dataflows
 Data flow validation
 Native integration with Oracle Cloud Infrastructure and SaaS
 Rule-based design protects from schema drift by handling schema changes
dynamically
 Innovative Optimizer for Spark ETL and pushdown E-LT
Fusion Data Extract - All Options Explored
Using BICC and OAC Replicator
The OAC Replicator allows native connectivity to Fusion SaaS
1. OAC calls BICC to request Public View Objects (PVOs) extracts
to Object Storage
2. BICC extracts the PVOs to Object Storage
3. OAC will read the BICC extracts from Object Storage and load
them to ADW
Using BICC and ODI MP or OCI DI
The ODI Marketplace and OCI Data Integration both have a native
connector to extract data from Fusion SaaS. These connectors use
BICC to extract data into Object Storage.
1. DI tool calls BICC to request PVOs extracts to Object Storage
2. DI tool leverages the target database technology for BICC extract
data transformations and ingestion into ADW.
Using ADW Self-Service E-L-T
Note: ADBs can ingest files from Object Storage or from the local file system.
Overview of Business Intelligence Cloud Connector (BICC)
About BICC
 BICC is the preferred method to extract business intelligence and other data in
bulk and load it into designated external storage areas.
 BICC is available as part of the Oracle Applications Cloud subscription.
 User provisioning to perform tasks related to data extraction is done in BICC.
Key Features of BICC
 Configure an external storage location that works for your data needs.
 Extract complete or partial data. You can select offerings or specific objects.
 Run extracts on-demand or schedule them to run at specified intervals during the
day, in a week, or throughout the month.
 Run incremental extracts if you need only the data that changed since your last
extract.
 Schedule multiple independent extracts at convenient intervals.
 Monitor extracts and review logs.
 Export configured offerings and associated data stores.
 Manage refresh metadata and specify dates for incremental refresh comparison.
First choice for Fusion centered use cases
Load and transform data from Oracle Fusion Cloud Applications
to build a data lake or data warehouse
Loading and transforming data from Fusion into Object Storage (data lake) in Parquet format and into Autonomous Data Warehouse using OCI Data Integration.
EPM Cloud Data Extract - All Options Explored
EPM Automate and ODI Marketplace
With EPM Automate and ODI MP installed in a VM:
1. ODI orchestrates a REST call (through EPM Automate) to EPM to extract the data, then downloads the file locally.
2. ODI then loads the data into ADW.
Oracle Integration Cloud or OCI Data Integration
Both OIC and OCI DI can make REST calls directly to EPM and perform
the following tasks:
1. Extract the data from EPM Cloud into EPM Outbox folder in a zip file.
2. Copy the zip file from the Outbox folder to Object Storage
3. Unzip the file
4. Load the data into ADW
Using ADW Self-Service E-L-T
Note: ADBs can ingest files from Object Storage or from the local file system.
About REST API for Oracle EPM Cloud
REST APIs allow service administrators and infrastructure consultants to perform
administration tasks in EPM Cloud.
EPM Cloud services with REST API capabilities:
 Planning (PLN)
 FreeForm (FF)
 Planning Modules
 Account Reconciliation (ARCS)
 Financial Consolidation and Close (FCC)
 Enterprise Profitability and Cost Management (EPCM)
 Tax Reporting (TR)
 Strategic Workforce Planning (SWP)
 Narrative Reporting (NR)
 Oracle Enterprise Data Management Cloud (EDMC)
 Data Management (DM)
Users can programmatically interact with EPM Cloud using:
 REST APIs
 EPM Automate Utility
 Groovy business rules (see Oracle Enterprise Performance Management Cloud Groovy
Rules Java API Reference)
EPM Automate
What is it?
EPM Automate is a command line utility that enables users to remotely interact
with their EPM applications and perform various repeatable tasks, such as:
 Import and export metadata and data
 Upload and download files,
 Run Data Management integrations
 Etc.
Key points regarding EPM Automate:
 Most interaction with EPM Automate is via a scripting language (python, bash,
batch…)
 Integrate with other scripts (such as Python) for end-to-end processes
 Orchestration, NOT an ETL tool
 You can create scripts that are capable of completing a wide array of tasks
and automate their execution using a scheduler.
 EPM Automate would need to be installed in a VM on OCI
Sample usage: copy a file to an Object Storage bucket:
 epmautomate copyToObjectStorage SOURCE_FILE_NAME USERNAME PASSWORD URL
 epmautomate copyToObjectStorage example_file.txt oracleidentitycloudservice/jDoe example_pwd https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/axaxnpcrorw5/bucket-20210301-1359
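Because EPM Automate is typically driven from a scripting language, a minimal Python wrapper might look like the sketch below. The `epmautomate` binary must already be installed on the VM, and the pod URL and credentials in the example session are placeholders:

```python
import subprocess

EPMAUTOMATE = "epmautomate"  # or the full path to the installed binary

def build_command(verb: str, *args: str) -> list:
    """Assemble an EPM Automate invocation as an argument list
    (safer than building shell strings)."""
    return [EPMAUTOMATE, verb, *args]

def run(verb: str, *args: str) -> None:
    """Run one EPM Automate command, raising on a non-zero exit code."""
    subprocess.run(build_command(verb, *args), check=True)

# Example session (hypothetical pod URL and credentials):
#   run("login", "jDoe", "example_pwd", "https://epm-example.oraclecloud.com")
#   run("copyToObjectStorage", "example_file.txt", "os_user", "os_pwd",
#       "https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/ns/bucket")
#   run("logout")
```

A wrapper like this can then be triggered by any scheduler (cron, OCI Resource Scheduler, etc.) for repeatable end-to-end processes.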
EPM Cloud Data Replication into FAW’s ADW Using ODI Marketplace
Data flow for this sample architecture:
1. An EPM Extract is created in EPM Cloud.
2. EPM Automate and ODI MP are installed in the same Linux VM.
3. ODI calls EPM Automate to log in to EPM and execute the extract job, storing the result in EPM Cloud as a ZIP file (containing multiple CSV files).
4. ODI calls EPM Automate to download the EPM Extract ZIP file, decompress the CSV extract files, merge them into a single CSV, and load them into ADW.
5. All further transformations (unpivoting, aggregation, lookups, soft deletes, …) happen inside ADW as subsequent ODI processes (out of scope here).
6. EPM is configured so that EPM data can be extracted on a convenient schedule.
7. The semantic model is extended by customizing an existing subject area or adding a new subject area, to allow reporting on the EPM data extracted into ADW, as per the CEAL Team blog available here.
Reference Architecture - EPM Cloud Data Replication into FAW ADW : Using ODI Marketplace
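Step 4's unzip-and-merge can be sketched in Python (in the reference architecture ODI performs this step; file names here are illustrative):

```python
import csv
import io
import zipfile

def merge_zip_csvs(zip_path: str, out_path: str) -> int:
    """Merge every CSV inside an EPM extract ZIP into one file,
    keeping the header row from the first file only.
    Returns the number of data rows written."""
    rows_written = 0
    with zipfile.ZipFile(zip_path) as zf, open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        header_done = False
        for name in sorted(zf.namelist()):
            if not name.lower().endswith(".csv"):
                continue
            with zf.open(name) as f:
                reader = csv.reader(io.TextIOWrapper(f, encoding="utf-8"))
                header = next(reader, None)
                if header is None:
                    continue  # skip empty member files
                if not header_done:
                    writer.writerow(header)
                    header_done = True
                for row in reader:
                    writer.writerow(row)
                    rows_written += 1
    return rows_written
```

This assumes all CSVs in the extract share the same column layout, which is the case for a single EPM export job split across files.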
EPM to ADW: OIC Flow
Features
 Scheduled to run via cron expression or ad-hoc
 ADW bulk load for improved performance
 Wait for the completion of the EPM jobs
 Files managed by reference up to a maximum size of 1GB
Components
 Integrations: export data, export metadata
 Connections: EPM (REST API), ADW
Comprehensive Data Integration in Oracle Cloud
Tools at a glance (columns in the original table: Focus/When to Use; Oracle ERP/SaaS; Social & 3rd-Party Apps; Ingest; Data Prep & Transforms; Analytics/Warehouse):
• OAC Data Prep/Replicator: Analytics Cloud focus, not for IT/enterprise DI (e.g., Salesforce)
• ADB Data Integrator: Autonomous DB focus, for ELT processing in ADB (e.g., SAP ABAP to ADW)
• OCI Data Integration: Object Storage and Data Science focused ETL/ELT
• OCI Data Flow: serverless Spark; use with OCI DI or for bespoke ETL (data science)
• OCI Streaming: message-based ingestion using Kafka APIs (event based)
• OCI GoldenGate & Stream Analytics: CDC, replication & streaming ETL + analytics (e.g., SAP ECC DB*)
• Oracle Integration Cloud: as MDW supplement for social/SaaS feeds (message payloads)
• Data Integrator (not an OCI-native application, but available via Marketplace): innovator and leader of ELT data processing (e.g., SAP ABAP)
Sources include Oracle & 3rd-party integrations.
* Limited availability and requires GVP approval
Best Practices
OAC Replicator vs Oracle Data Integrator (ODI)
• No additional cost for OAC
• End-to-end process (Fusion -> Object Storage -> OAC -> ADW)
• User friendly (vs most data integration tools)
• Little to no data transformation needed
• Data Movements are happening within the Oracle Network (Fusion pushes
the data to Object Storage)
• Supports incremental extracts
• PaaS component, limited to no additional maintenance (automated patching)
• Allows user to schedule tasks
• Data flows through OAC instance
• Not recommended for high data volumes
• Target has to be an Oracle DB in the Cloud (on-prem DB on the roadmap)
• Limited orchestration and error handling capabilities
• Notifications only on the BICC side
• Limited source systems
Option A: OAC Replicator
• Enterprise Data integration platform for a more strategic approach to DWH
• High performance for bulk load
• Data transformation happens at the same time as the load
• Supports the use of process design through workflows and email
notifications and scheduling
• Native integration with Fusion SaaS and Oracle Databases (ADB, DBaaS,
on-prem DBs…) and many other source systems.
• Native Integration with 50+ target data store (Cloud or on-prem), including
Azure SQL, Snowflake
• Data Movements are happening within the Oracle Network when using
Object Storage as the external storage for BICC Extracts (Fusion SaaS
pushes the data to Object Storage and target is in OCI)
• Reads BICC Extracts data from Object Storage
• Supports incremental extracts
• Provisioned from OCI Marketplace; requires a VM.Standard2.4 shape or larger VM image
• Less intuitive; complex configuration; product knowledge required
• IaaS component; requires planning/resources for patching and upgrades
Option B: ODI MarketPlace
OAC Replicator vs OCI Data Integration (OCI DI)
• No additional cost for OAC
• End-to-end process (Fusion -> Object Storage -> OAC -> ADW)
• User friendly (vs most data integration tools)
• Little to no data transformation needed
• Data Movements are happening within the Oracle Network (Fusion pushes
the data to Object Storage)
• Supports incremental extracts
• PaaS component, limited to no additional maintenance (automated patching)
• Allows user to schedule tasks
• Data flows through OAC instance
• Not recommended for high data volumes
• Target has to be an Oracle DB in the Cloud (on-prem DB on the roadmap)
• Limited orchestration and error handling capabilities
• Notifications only on the BICC side
• Limited source systems
Option A: OAC Replicator
• OCI Data Integration (OCI DI) has a fully-managed serverless ETL
architecture that reduces maintenance efforts and auto scales to manage
unpredictable data workloads
• Innovate faster by simplifying integration into data warehouses and data
lakes to make data-driven decisions
• Integrate more easily with a graphical, no-code user experience and preview
data to see the results of a transformation
• Rules-based integration enables ETL developers and data engineers to
handle schema evolutions
• Discover and connect quickly to popular databases, data lakes and
applications which allows you to quickly prepare data sets for data science
projects
• Hybrid execution powered by Spark ETL or push-down E-LT
• Storing and processing data on the same network reduces data latency and improves business performance
• Complements other OCI data and AI services (OCI Data Catalog, OCI Data
Flow, OCI Data Science) to manage data lakes
• Only supports Oracle Cloud Infrastructure data management
• Launched in 2020, may have limited features
Option B: OCI DI
Fusion Data Extraction Guidelines from Oracle A-Team
 Explore available data extract/export options and choose the right one based on your requirements.
 Understand the supported objects and volume/size limitations from pillar (HCM / CX / ERP / SCM)
documentation based on your extract option.
 Consider the mode of integration (batch or real-time), type of integration (synchronous or asynchronous),
frequency (daily, weekly, monthly, …), data volume (# of records to be retrieved/processed), expected file size,
storage/retrieval options, duration, and performance while choosing the extract option.
 Document the data extract requirements and design the solution by considering usage of data, automation, and
performance.
 Do not use OTBI for data extracts. OTBI is a reporting tool and is not recommended for synchronous
integrations; OTBI analyses are limited to 25,000 records when exporting to Excel.
 Avoid developing BIP reports using custom SQLs for data extracts and integration requirements. Consider
performance, timeout, file size, and formatting if you are developing custom BIP reports.
 BICC is the recommended option of extracting bulk data for ERP, SCM, CX in batch out of Fusion Cloud
Applications for external applications/data warehouse/reporting. BICC supports incremental extract. For HCM
BICC is used only with Fusion Analytics Warehouse and HCM analytics.
Data Extraction Options and Guidelines for Oracle Fusion Applications Suite
 Consider Security, compliance, data privacy, encryption requirements while designing the data extraction/replication solution.
 Leverage Oracle Cloud Infrastructure Cloud (OCI) and Oracle Integration Cloud (OIC) capabilities for orchestrating and automating the data extracts and integrations.
 Avoid using REST API / SOAP services for extracting/exporting high-volume data set from Oracle Fusion Cloud Application. REST API / SOAP services are recommended only for
real-time integrations.
 Always refer to the latest documentation to get updates on features.
 Research Cloud Customer Connect (https://community.oracle.com/customerconnect/) if you find gaps, and create ideas as needed.
 Log a Service Request with Oracle if any of the above extract options don't meet your needs to get the right guidance.
Implementation Best Practices for EPM Cloud REST APIs
Use the implementation best practices listed in this topic when working with the EPM Cloud
REST APIs.
Best practices:
• Before using the REST APIs, complete the prerequisites.
• Use the correct authentication, as described in OAuth 2 and Basic Authentication for EPM
Cloud REST APIs.
• Understand the URL structure.
• Know how to get the current REST API version.
• Review the sample scenarios to get started quickly.
• Be aware of REST API compatibility.
• Use the Quick Reference to find all of the Oracle Enterprise Performance Management
Cloud REST APIs at a glance.
• Use API development tools (e.g., Postman)
Best Practices
Leverage the best set of services for the specific use case – use as needed
 OCI has an extensive set of services and capabilities,
continuously being improved, that can address any
workload; use as needed, to address the specific use case
 Leverage serverless services as much as possible (less
operational overhead, increased chance of success)
 Leverage a just-enough approach (don't oversimplify, don't overcomplicate)
 Data storage should be decided based on use case (for
telemetry in Object Storage, financial data in ADW)
 Lakehouse data should be organized leveraging a
medallion architecture (bronze, silver, gold) or similar for
governance and security
 For schema-on-write data, understand which data modelling approach works best for the use case; traditionally a star schema is a good option, as it is widely supported by OCI services
 Leverage the data ingestion and processing engines that best support the requirements (for timely data feeds leverage CDC and real time; for massive data processing leverage Spark scale-out processing, …)
 Don't use a single pattern to address all use cases (avoid the law of the hammer: what works for one use case might not work for another)
Dipping our toes into Data Mesh
Exploring the art of possible
Unravel Data Mesh
Future of Decentralized Information Management
RED HOT
Jakub ILLNER
Analytics & Lakehouse Cloud Design Specialist Leader
Technology Cloud Engineering, EMEA
19 January 2023
OTube Link: https://otube.oracle.com/media/1_b82m570m
Data Mesh is a decentralized, organizational and architectural* approach to sharing and managing analytical data in complex and large-scale environments – within and across organizations.
From “Data Mesh – Delivering Data-Driven Value at Scale” by Zhamak Dehghani, published in March 2022 by O’Reilly
* Zhamak Dehghani used term “Sociotechnical”, instead of “Organizational and Architectural”.
Data Mesh Principles
• Domain Ownership: analytical data is owned by the business domains that produce it
• Data as a Product: domains publish their data as usable, trustworthy products
• Self-Serve Data Platform: a shared platform lets domain teams build and serve data products autonomously
• Federated Computational Governance: interoperability and policy rules are agreed globally and automated across domains
Characteristics of Usable Data Products
Discoverable
Data must be easily
discoverable, e.g., by using
company wide data catalog
with all data products and
related metadata
information.
Addressable
Data must be reachable via
long lasting unique address,
using common addressing
conventions for polyglot data
types.
Understandable
Data product semantics,
structures, and technical
features must be described
in a form that is easily
accessible and
understandable.
Trustworthy
Data product guarantees
service level objectives
(SLOs) that include change
frequency, timeliness, shape,
granularity, quality,
completeness, etc.
Natively Accessible
Data must be accessible in formats suitable to customer requirements (stream, file, table, API, etc.), possibly by using polyglot stores.
Interoperable
Data products must comply
with harmonization rules that
allow correlation across
domains (identifiers, schema
compatibility, metadata, etc.).
Valuable on its Own
Data product must be
valuable and meaningful to
business; technical objects
should be hidden from users
of data products.
Secure
Access to data product must
be protected via access
control. Data must be
encrypted and protected
according to its
classification.
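The characteristics above can be made concrete as a data product descriptor. The sketch below is purely hypothetical (the class, field names, and `mesh://` addressing scheme are illustrative, not a real API):

```python
# Hypothetical descriptor capturing several data product characteristics.
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str                        # Understandable: named, described product
    address: str                     # Addressable: long-lasting unique address
    owner_domain: str                # Valuable on its own: business-aligned owner
    output_formats: list = field(default_factory=list)  # Natively accessible
    slo_freshness_hours: float = 24.0                   # Trustworthy: SLO
    access_roles: list = field(default_factory=list)    # Secure: access control

    def meets_freshness_slo(self, hours_since_refresh: float) -> bool:
        # Trustworthy: the product guarantees a change-frequency SLO.
        return hours_since_refresh <= self.slo_freshness_hours

orders = DataProduct(
    name="orders",
    address="mesh://sales/orders/v1",
    owner_domain="sales",
    output_formats=["table", "api"],
    slo_freshness_hours=6.0,
    access_roles=["analyst"],
)
```

A catalog of such descriptors is one way to make products discoverable across domains.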
Examples of Data Products on OCI
Streaming database events as data products
 OCI GoldenGate to ingest database transactions
 OCI Streaming as the event store
 OCI GoldenGate Stream Analytics to curate and transform data streams
 OCI Object Storage to archive events (for auditing, logging, re-processing)
Publishing data from Fusion Apps as data products
 OCI Data Integration invokes BICC, which exports data to OCI Object Storage
 OCI Data Integration transforms and aggregates BICC data into the target dimensional data model
 Autonomous Data Warehouse manages the data and provides query services
(Diagram source systems: eBS, JDE, Psoft, …)
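The first pattern (streaming database events as a data product) can be sketched end to end with in-memory stand-ins for the OCI services; everything here is illustrative, not the real service APIs:

```python
# In-memory stand-ins for the streaming pattern:
# ingest (GoldenGate) -> event store (Streaming) ->
# curate (Stream Analytics) -> archive (Object Storage).
event_store = []   # stands in for OCI Streaming
archive = []       # stands in for OCI Object Storage

def ingest(transactions):
    # GoldenGate-style capture: publish each database change as an event.
    for txn in transactions:
        event_store.append(txn)

def curate(events):
    # Stream-Analytics-style curation: keep committed changes only.
    return [e for e in events if e["op"] in ("insert", "update")]

def archive_events(events):
    # Object-Storage-style archive of raw events for auditing/re-processing.
    archive.extend(events)

ingest([{"op": "insert", "id": 1}, {"op": "rollback", "id": 2}])
curated = curate(event_store)
archive_events(event_store)
```

Note that the raw stream is archived in full, while consumers of the data product see only the curated view.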
Specialists Assistance
How can Cloud Solution Specialists assist with Lakehouse workloads?
Cloud Solution Specialists – Analytics & Lakehouse Design
We are here to assist you and your Customer on their journey to OCI
 We provide assistance throughout the complete sales cycle with discrete activities
 We partner and collaborate with Product Management to create assets to be reused with Customers
 We engage and help close Oracle Data Platform & Lakehouse workloads
 We leverage years of expertise in Analytics & Lakehouse and combine it with best practices to bring value to your Customers
Check out our Confluence page for more information and reusable assets.
Sales cycle stages: 1. Workload Qualification, 2. Workload Validation, 3. Workload Confirmation, 4. Adoption, 5. Consuming
Supporting activities: Workload Workshop, Solution Assistance, Workload Architecture Design, Workload Architecture Healthcheck, Lift Implementation Design
Useful URLs
Oracle Analytics Cloud (OAC)
 Oracle Analytics Cloud - Get Started: https://docs.oracle.com/en/cloud/paas/analytics-cloud/index.html
 Oracle Analytics Cloud and Server Roadmap: https://www.oracle.com/business-analytics/cloud-and-server-roadmap.html
 What’s New for Oracle Analytics Cloud: https://docs.oracle.com/en/cloud/paas/analytics-cloud/acswn/index.html#ACSWN-GUID-CFF90F44-BCEB-49EE-B40B-8D040F02D476
 Create Services with Oracle Analytics Cloud: https://docs.oracle.com/en/cloud/paas/analytics-cloud/acoci/create-services.html#GUID-47022452-65CC-4345-8F7F-A447BB24A48A
 SaaS Data Replication in Oracle Analytics Cloud (OAC): https://www.ateam-oracle.com/saas-data-replication-in-oracle-analytics-cloud-oac-and-oaac
Fusion Cloud
 Creating a Business Intelligence Cloud Extract: https://docs.oracle.com/en/cloud/saas/applications-common/21b/biacc/get-started.html#get-started
 OCI Data Integration (OCI DI) Help Center: https://docs.oracle.com/en-us/iaas/data-integration/using/index.htm
 Load and transform data from Oracle Fusion Cloud Applications to build a data lake or data warehouse: https://blogs.oracle.com/cloud-infrastructure/post/load-and-transform-data-from-oracle-fusion-cloud-applications-to-build-a-data-lake-or-data-warehouse
Enterprise Performance Management (EPM)
 EPM Automate - Copying a Snapshot to or from Oracle Object Storage: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/cepma/sample_script_15_object_store.html
 REST API for Oracle EPM: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/prest/index.html
 Working with EPM Automate for Oracle EPM: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/cepma/index.html
Our mission is to help people see data in new ways, discover insights, and unlock endless possibilities.
  • 1. Analytics and Lakehouse for Oracle Applications…IntegrationOptions Explained Red Hot Ray Fevrier Analytics & Lakehouse Cloud Design Specialist March 3rd, 2023 Contributors • Morgan Russell • Wilbert Poeliejoe • Anis Zerelli • Carmine Acanfora • Alina Stuparu
  • 2. The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, timing, and pricing of any features or functionality described for change and remains at the sole discretion of Oracle Corporation. Safe Harbor Statement Copyright © 2023, Oracle and/or its affiliates | Confidential: Internal 2
  • 3. Agenda Modern Data Platform Fusion and EPM Data Extraction Best Practices Dipping our toes into Data Mesh Specialists Assistance
  • 4. Modern Data Platform 4 Copyright © 2022, Oracle and/or its affiliates | Confidential: Internal The modern data platform is an enabler for successful data-driven organizations.
  • 5. Support Modern Data Platforms in OCI with Lakehouse as the foundation Red Hot Jose Cruz Analytics & Lakehouse Cloud Design Specialist Leader January 18th, 2022 OTube Link: https://otube.oracle.com/media/Red+Hot+-+Support+Modern+Data+Platforms+in+OCI+with+Lakehouse+as+the+foundation/1_sk0c3ty3
  • 6. Data Ecosystem & Conceptual Architecture Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 6 Data Decisions People Processes Decisions generate data Data influences decisions The enabler to support a distributed data economy The data ecosystem is the overall enabler for the data economy and is key to finding and using the hidden data capital. The modern data platform is the technology enabler and provides capabilities to address several architectural styles used to find, curate and use data capital.
  • 7. 7 Copyright © 2022, Oracle and/or its affiliates | Confidential: Internal Data Lake  Agile, new data sets may be quickly onboarded  Any data, any history, cost-effective storage  “Schema on read” for raw data, quick to add new data sets  Scalable storage, processing, and access  Data repository for analytics, ML, data labs, etc. Data Warehouse  Consistent, integrated data with data model and constraints  All analytical data in a single place  Strong emphasis on data quality and consistency  “Schema on write” to conform data from sources  Single Version of (analytical) Truth Data Lakehouse  Best of both worlds  Combines benefits of Data Lake and Data Warehouse:  Agility, scalability, and costs of Data Lake  Data governance and SQL access of Data Warehouse  Offers close integration between Data Warehouse and Data Lake Building a Modern Data Platform with Data Lakehouse Lakehouse Value Proposition
  • 8. Oracle Lakehouse Reference Architecture Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 8 The data refinery is responsible for the ingestion, orchestration and, in part, the transformation of data into the persistence platform. The persistence platform stores data in multiple layers of storage, from economical scale-out object storage, to formalized data warehouse and/or transactional systems. Data enrichment occurs, data quality is scrutinized, and the data definition is formalized as data moves between the storage layers. The access and interpretation platform includes visualization capabilities for analytics of both static and streaming data. In addition, the results of executed machine learning models are curated and exposed for analysis. Cloud data lake house - process enterprise and streaming data for analysis and machine learning Overall concepts
  • 9. Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 9 Cloud data lake house - process enterprise and streaming data for analysis and machine learning Gold (curated information) This is where the past, current and future interpretation of enterprise information resides. In this layer the data is structured to support agile access and navigation. Silver (curated data) Data in this layer has usually been enriched and/or has passed stringent data quality measures. The data is in an immutable modelled form and Business Process neutral. Bronze (raw data) Data is loaded unchanged from sources. Enrichment of the data would promote it to the Curated data layers. Data in this layer could be represented logically in the Curated layers or queried for data science purposes. Oracle Lakehouse Reference Architecture Data Persistency and Processing Layer
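The bronze/silver/gold layering above can be reflected directly in Object Storage naming. A minimal sketch of such a convention; the prefix layout, source, and dataset names are illustrative assumptions, not an Oracle standard:

```python
from datetime import date

# Hypothetical medallion-style prefix convention for lakehouse object storage.
LAYERS = ("bronze", "silver", "gold")

def object_key(layer: str, source: str, dataset: str, load_date: date) -> str:
    """Build an object storage key like 'bronze/fusion/gl_balances/2023/03/10/'."""
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer!r}")
    return f"{layer}/{source}/{dataset}/{load_date:%Y/%m/%d}/"

# Raw BICC extracts land in bronze; curated data is promoted to silver and gold.
raw_prefix = object_key("bronze", "fusion", "gl_balances", date(2023, 3, 10))
```

Keeping the layer as the leading path segment makes per-layer security policies (e.g., bucket or prefix-level IAM rules) straightforward.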
  • 10. Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 10 Cloud data lake house - process enterprise and streaming data for analysis and machine learning Real Time Ingest Cloud Storage Analytics & Visualization AI Services Governance Streaming Ingest Machine Learning Bulk Transfer Batch Processing Batch Ingest Serving Data Store Streaming Processing Hadoop Ecosystem APIs Streaming Analytics Capabilities and Key Characteristics  Inclusive of all data types  Agile in using all data  Wide range of processing engines  Flexible and modular  Open and interoperable  Faster innovation  Highly available and resilient  Scalable and elastic  Secure and compliant by design  Governed and trustable  Cost efficient  Adaptable to current needs  Future proof  Self serve to deploy and consume  Entice data democratization  Embedded intelligence Oracle Lakehouse Reference Architecture
  • 11. Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 11 Translytical Modern Data Platform  What happened with my business this quarter?  How does that compare with last year’s quarter?  What is the trend of my sales?  How can I optimize my costs and operations?  How can I leverage information to drive business?  How can I be prescriptive based on data?  How can I infuse intelligence in my processes?  What if I change my strategy? Cloud data lake house - process enterprise and streaming data for analysis and machine learning Oracle Lakehouse Reference Architecture
  • 12.  Can be deployed across public, hybrid and multicloud  Interoperable with any system deployed on-prem or in 3rd-party clouds  Seamless integration with services in Oracle Cloud, including SaaS  Leverages OCI features for running and operating the workloads  Can be extended with OCI Marketplace  Can be extended with IaaS services to address highly heterogeneous workloads  Can be part of a larger Customer workload  Leverages implicitly all OCI capabilities 3/10/2023 Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal/Restricted/Highly Restricted 12 Compute Storage Networking Oracle Databases Open Source Databases Operating Systems, Native VMWare Developer Services Containers and Functions Application Integration Data Lakehouse Machine Learning and AI Analytics and BI Oracle Applications Custom Applications Global Cloud Datacenter Infrastructure Public Cloud Regions | Hybrid Cloud: Cloud@Customer, Dedicated Regions, Roving Edge | Multicloud: Azure, AWS Security | Observability | Compliance | Messaging | Governance ISV Applications Modern Data Platform Part of a wider cloud infrastructure and platform
  • 13. Fusion and EPM Data Extraction 13 Copyright © 2022, Oracle and/or its affiliates | Confidential: Internal
  • 14. Fusion and EPM (ERPM) Logical Architecture Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 14 A Few Key Points…  Interaction with EPM Cloud is done via REST APIs.  EPM Automate is a REST API wrapper that allows users to interact with their EPM pods; however, it needs to be installed on-prem or in a VM.  Preferred method to extract data from Fusion is through Business Intelligence Cloud Connector (BICC).  Oracle Analytics Cloud (OAC) is the only BI tool able to connect directly to both ERP and EPM.  Data Persistency is highly recommended to reduce the dependency on the SaaS system, especially when doing historical analysis. Also helps with API throttling.
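Since interaction with EPM Cloud is via REST APIs, every call carries an Authorization header. A minimal sketch of composing HTTP Basic credentials with the Python standard library (the username and password are placeholders; production deployments may use OAuth 2 instead, as the EPM best-practice docs note):

```python
import base64

def basic_auth_header(username: str, password: str) -> dict:
    """Compose the HTTP headers for a Basic-authenticated EPM Cloud REST call."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}", "Content-Type": "application/json"}

# Placeholder credentials for illustration only.
headers = basic_auth_header("jdoe@example.com", "example_pwd")
```

The resulting dict would be passed to any HTTP client (e.g., `requests.get(url, headers=headers)`).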
  • 15. Oracle Analytics Cloud (OAC) Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 15 The Oracle Analytics platform is a cloud native service that provides the capabilities required to address the entire analytics process including data ingestion and modeling, data preparation and enrichment, and visualization and collaboration, without compromising security and governance.  Embedded machine learning (ML) and natural language processing (NLP) technologies help increase productivity and build an analytics-driven culture in organizations  Oracle Analytics supports a hybrid deployment strategy (start on-premises or in the cloud)  Connectors to dozens of data sources including: Amazon Redshift, Fusion Apps, EPM Cloud, Microsoft SQL Server, Snowflake, Oracle Autonomous Database, etc.  Data Flows and Pipelines, built into OAC, help move and transform data across analytic pipeline stages  Data Replicator, a user-friendly (vs. most data integration tools) replication tool, allows users to copy data from Fusion to ADW First choice for OAC centered use cases
  • 16. Autonomous Data Warehouse (ADW) Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 16 First choice for ADB centered use cases Autonomous Database (ADB) is a fully automated database service that makes it easy for organizations to develop and deploy application workloads regardless of complexity, scale, or criticality. ADB’s Autonomous Data Warehouse (ADW) supports analytics and warehouse workloads.  Data Transform (powered by Oracle Data Integrator) provides a powerful ELT engine that is heavily optimized for Oracle SQL and database utilities  Analytic Views provide a single version of the truth by creating a single definition of structure and calculation rules (aggregations, measures) in the database  Define relationships between dimension and fact tables (star schema)  Define the business model (semantic layer) and calculation expressions  Primary use cases include: visualization-agnostic modeling, enhancing data sets for OAC, application development using APEX  Process data from multiple sources/Clouds  Autonomous Databases (ADW, ATP…)  Cloud Storage: OCI Object Storage, AWS Simple Storage Service (S3), Azure Data Lake Storage (ADLS)  ADW processes flat files (CSV, Parquet, JSON, Avro, ORC)
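ADW's self-service ingestion of flat files from Object Storage is typically driven by the DBMS_CLOUD package. A hedged sketch that only generates the PL/SQL text — the table, credential, and URI values are hypothetical, and the statement would be executed in ADW via SQLcl, SQL Developer, or similar:

```python
def copy_data_plsql(table: str, credential: str, uri: str, fmt_type: str = "csv") -> str:
    """Generate a DBMS_CLOUD.COPY_DATA anonymous block that loads flat files
    from Object Storage into an existing ADW table."""
    return (
        "BEGIN\n"
        "  DBMS_CLOUD.COPY_DATA(\n"
        f"    table_name      => '{table}',\n"
        f"    credential_name => '{credential}',\n"
        f"    file_uri_list   => '{uri}',\n"
        f"    format          => json_object('type' value '{fmt_type}', 'skipheaders' value '1')\n"
        "  );\n"
        "END;"
    )

sql = copy_data_plsql(
    "GL_BALANCES",                                   # hypothetical target table
    "OBJ_STORE_CRED",                                # credential created with DBMS_CLOUD.CREATE_CREDENTIAL
    "https://objectstorage.us-ashburn-1.oraclecloud.com/n/mytenancy/b/extracts/o/gl_balances.csv",
)
```

Generating the statement from parameters keeps bucket names and credentials out of hard-coded scripts.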
  • 17. Oracle Integration Cloud (OIC) Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 17 A visual application integration and automation tool that provides prebuilt connectivity to SaaS and on-premises applications Benefits  Run-ready process automation templates with drag-and-drop designers  A low-code visual builder for web and mobile application development.  Business insight across end-to-end digital processes  B2B capabilities for EDI and secure file transfer  Machine learning recommendations for easy data mapping Key Highlights  Mainly used as an orchestration tool or for application-to-application integration  Files managed by reference up to a maximum size of 1 GB  Ideal when interacting with applications using REST APIs
  • 18. Oracle Data Integrator (ODI) Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 18 [Diagram: Next Generation “E-L-T” Architecture — bulk data movement from most apps, databases, and clouds (Teradata, SAP, Siebel, eBusiness Suite, IBM DB2, Netezza) via BICC, Oracle Datapump, Oracle DBLink, JMS, and External Tables, with transformations pushed to ADW, Spark, Hive, HBase, Pig, and Sqoop] Oracle Data Integrator (ODI) provides data migration with its innovative extract, load, and transform (E-L-T) technology, optimized for most on-premises and cloud databases. Original E-LT Innovator  E-L-T approach allows users to perform transformations at either source or target  E-L-T provides a flexible architecture for optimized performance on any platform  ODI provides high performance bulk data movement, massively parallel data transformation using database or big data technologies, and block-level data loading that leverages native data utilities Benefits  Prebuilt connectors for many databases and technologies  Fewer network hops, which improves loading performance  Takes advantage of existing infrastructure  Pluggable Knowledge Module Architecture  Available on-prem and in OCI Marketplace
  • 19. OCI Data Integration (OCI DI) Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 19 First choice for OCI platform-wide use cases What is it? OCI DI is a fully-managed, cloud native ETL tool that simplifies complex data extract, transform, and load processes (ETL/E-LT) into data lakes and warehouses for data science and analytics with a modern, no-code dataflow designer. Key Features  Intuitive, no-code, graphical user interface that makes it easier to design and manage data flows.  Easy-to-use graphic designer that automatically generates execution code and provides a visual preview of the dataflow prior to load  Pay As You Go pricing which makes OCI DI cost effective (no need to invest in new servers and software)  Reusable templates and dataflows  Data flow validation  Native integration with Oracle Cloud Infrastructure and SaaS  Rule-based design protects from schema drift by handling schema changes dynamically  Innovative Optimizer for Spark ETL and pushdown E-LT
  • 20. Fusion Data Extract - All Options Explored Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 20 Using BICC and OAC Replicator The OAC Replicator allows native connectivity to Fusion SaaS 1. OAC calls BICC to request Public View Objects (PVOs) extracts to Object Storage 2. BICC extracts the PVOs to Object Storage 3. OAC will read the BICC extracts from Object Storage and load them to ADW Using BICC and ODI MP or OCI DI The ODI Marketplace and OCI Data Integration both have a native connector to extract data from Fusion SaaS. These connectors use BICC to extract data into Object Storage. 1. DI tool calls BICC to request PVOs extracts to Object Storage 2. DI tool leverages the target database technology for BICC extract data transformations and ingestion into ADW. Using ADW Self-Service E-L-T Note: ADBs can ingest files from Object Storage or from local system.
  • 21. Overview of Business Intelligence Cloud Connector (BICC) Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 21 About BICC  BICC is the preferred method to extract business intelligence and other data in bulk and load it into designated external storage areas.  BICC is available as part of the Oracle Applications Cloud subscription.  User provisioning to perform tasks related to data extraction is done in BICC. Key Features of BICC  Configure an external storage location that works for your data needs.  Extract complete or partial data. You can select offerings or specific objects.  Run extracts on-demand or schedule them to run at specified intervals during the day, in a week, or throughout the month.  Run incremental extracts if you need only the data that changed since your last extract.  Schedule multiple independent extracts at convenient intervals.  Monitor extracts and review logs.  Export configured offerings and associated data stores.  Manage refresh metadata and specify dates for incremental refresh comparison. First choice for Fusion centered use cases
  • 22. Load and transform data from Oracle Fusion Cloud Applications to build a data lake or data warehouse Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 22 Loading and transforming data from Fusion into Object Storage (data lake) in Parquet format and in Autonomous Data Warehouse using OCI Data Integration.
  • 23. EPM Cloud Data Extract - All Options Explored Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 23 EPM Automate and ODI Marketplace We can use EPM Automate and ODI MP as follows: 1. Install EPM Automate and ODI MP in a VM. 2. ODI orchestrates a REST call (through EPM Automate) to EPM, extracts the data, then downloads the file locally. 3. ODI then loads the data into ADW. Oracle Integration Cloud or OCI Data Integration Both OIC and OCI DI can make REST calls directly to EPM and perform the following tasks: 1. Extract the data from EPM Cloud into the EPM Outbox folder as a zip file. 2. Copy the zip file from the Outbox folder to Object Storage 3. Unzip the file 4. Load the data into ADW Using ADW Self-Service E-L-T Note: ADBs can ingest files from Object Storage or from the local file system.
  • 24. About REST API for Oracle EPM Cloud Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 24 REST APIs allow service administrators and infrastructure consultants to perform administration tasks in EPM Cloud. EPM Cloud services with REST API capabilities:  Planning (PLN)  FreeForm (FF)  Planning Modules  Account Reconciliation (ARCS)  Financial Consolidation and Close (FCC)  Enterprise Profitability and Cost Management (EPCM)  Tax Reporting (TR)  Strategic Workforce Planning (SWP)  Narrative Reporting (NR)  Oracle Enterprise Data Management Cloud (EDMC)  Data Management (DM) Users can programmatically interact with EPM Cloud using:  REST APIs  EPM Automate Utility  Groovy business rules (see Oracle Enterprise Performance Management Cloud Groovy Rules Java API Reference)
  • 25. EPM Automate Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 25 What is it? EPM Automate is a command line utility that enables users to remotely interact with their EPM applications and perform various repeatable tasks, such as:  Import and export metadata and data  Upload and download files  Run Data Management integrations  Etc. Key points regarding EPM Automate:  Most interaction with EPM Automate is via a scripting language (Python, bash, batch…)  Integrate with other scripts (such as Python) for end-to-end processes  Orchestration, NOT an ETL tool  You can create scripts that are capable of completing a wide array of tasks and automate their execution using a scheduler.  EPM Automate would need to be installed in a VM on OCI Sample Usage: Copy a file to an Object Storage bucket:  epmautomate copyToObjectStorage SOURCE_FILE_NAME USERNAME PASSWORD URL  epmautomate copyToObjectStorage example_file.txt oracleidentitycloudservice/jDoe example_pwd https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/axaxnpcrorw5/bucket-20210301-1359
  • 26. EPM Cloud Data Replication into FAW’s ADW Using ODI Marketplace Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 26 Data Flow for this sample architecture 1. An EPM Extract will be created in EPM Cloud 2. EPM Automate and ODI MP will be installed in the same Linux VM 3. ODI will call EPM Automate to log in to EPM and execute the extract job, storing the result in EPM Cloud as a ZIP file (containing multiple CSV files) 4. ODI will call EPM Automate to download the EPM extract ZIP file, decompress the EPM CSV extract files, merge them into one single CSV, and load them into ADW 5. All further transformations (unpivoting, aggregation, lookups, soft deletes,...) will happen inside the ADW as subsequent ODI processes (not documented as part of the scope of this blog). 6. EPM will be configured in such a way that EPM data can be extracted on a convenient schedule 7. The semantic model will be extended by customizing an existing subject area or adding a new subject area to allow reporting on EPM data extracted into ADW, as per the CEAL Team blog available here. Reference Architecture - EPM Cloud Data Replication into FAW ADW: Using ODI Marketplace
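Step 4 above (decompress the EPM export ZIP and merge its CSV members into one file) can be sketched in Python. A minimal, hedged sketch — the ZIP is handled in memory and the helper name is ours, not part of any Oracle tooling:

```python
import csv
import io
import zipfile

def merge_csv_from_zip(zip_bytes: bytes) -> str:
    """Decompress an EPM export ZIP and merge its CSV members into a single CSV,
    keeping the header row from the first file only."""
    merged = io.StringIO()
    writer = None
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in sorted(zf.namelist()):
            if not name.endswith(".csv"):
                continue
            rows = csv.reader(io.TextIOWrapper(zf.open(name), encoding="utf-8"))
            header = next(rows)
            if writer is None:            # first member: emit the header once
                writer = csv.writer(merged)
                writer.writerow(header)
            for row in rows:              # subsequent members: data rows only
                writer.writerow(row)
    return merged.getvalue()
```

In the ODI flow this merge would run between the EPM Automate download and the ADW load; an assumption here is that all CSV members share the same header.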
  • 27. EPM to ADW: OIC Flow Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 27 Features  Scheduled to run via cron expression or ad-hoc  ADW bulk load for improved performance  Wait for the completion of the EPM jobs  Files managed by reference up to a maximum size of 1GB Components  Integrations: export data, export metadata  Connections: EPM (REST API), ADW
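"Wait for the completion of the EPM jobs" above is, in practice, a polling loop. A minimal hedged sketch with a pluggable status callable and sleep function — the status values are illustrative; actual EPM job states differ by service:

```python
import time

def wait_for_job(get_status, timeout_s: float = 600, poll_s: float = 5, sleep=time.sleep):
    """Poll a job-status callable until it reports a terminal state or times out.
    `get_status` is any zero-argument function returning e.g. 'RUNNING',
    'SUCCESS', or 'FAILED' (placeholder states for this sketch)."""
    waited = 0.0
    while waited < timeout_s:
        status = get_status()
        if status in ("SUCCESS", "FAILED"):
            return status
        sleep(poll_s)
        waited += poll_s
    raise TimeoutError("job did not reach a terminal state in time")
```

Injecting `sleep` keeps the loop testable; in an OIC-style integration the same wait would be modeled with a loop action and a wait step.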
  • 28. Comprehensive Data Integration in Oracle Cloud Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 28 [Flattened capability matrix; columns covered Oracle ERP/SaaS, Social & 3rd Party Apps, Ingest, Data Prep & Transforms, and Analytics/Warehouse] Focus / When to Use: • OAC Data Prep/Replicator — Analytics Cloud focus, not for IT/Enterprise DI (Salesforce, ADB) • Data Integrator for ADB — Autonomous DB focus, for ELT processing in ADB (SAP ABAP, ADW) • OCI Data Integration — Object Storage and Data Science focused ETL/ELT • OCI Data Flow — Serverless Spark, use with OCI-DI or for bespoke ETL (data science) • OCI Streaming — Message-based ingestion using Kafka APIs (event based) • OCI GoldenGate & Stream Analytics — CDC, Replication & Streaming ETL + Analytics (SAP ECC DB*) • Oracle Integration Cloud — As MDW supplement for Social / SaaS feeds (message payloads) • Data Integrator (not an OCI native application but available via Marketplace) — Innovator and leader of ELT data processing (SAP ABAP) Oracle & 3rd Party Integrations * Limited availability and requires GVP approval
  • 29. Best Practices 29 Copyright © 2022, Oracle and/or its affiliates | Confidential: Internal
  • 30. OAC Replicator vs Oracle Data Integrator (ODI) Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 30 • No additional cost for OAC • End-to-end process (Fusion -> Object Storage -> OAC -> ADW) • User friendly (vs. most data integration tools) • Little to no data transformation needed • Data movements happen within the Oracle network (Fusion pushes the data to Object Storage) • Supports incremental extracts • PaaS component, limited to no additional maintenance (automated patching) • Allows user to schedule tasks • Data flows through OAC instance • Not recommended for high data volumes • Target has to be an Oracle DB in the Cloud (on-prem DB on the roadmap) • Limited orchestration and error handling capabilities • Notifications only on the BICC side • Limited source systems Option A: OAC Replicator • Enterprise data integration platform for a more strategic approach to DWH • High performance for bulk load • Data transformation happens at the same time as the load • Supports process design through workflows, email notifications and scheduling • Native integration with Fusion SaaS and Oracle Databases (ADB, DBaaS, on-prem DBs…) and many other source systems. • Native integration with 50+ target data stores (Cloud or on-prem), including Azure SQL, Snowflake • Data movements happen within the Oracle network when using Object Storage as the external storage for BICC extracts (Fusion SaaS pushes the data to Object Storage and the target is in OCI) • Reads BICC extract data from Object Storage • Supports incremental extracts • Provisioned from OCI Marketplace; requires a VM.Standard2.4 shape or higher VM image • Less intuitive, complex configuration, product knowledge required. • IaaS component, requires planning/resources for patching and upgrades Option B: ODI Marketplace
  • 31. OAC Replicator vs OCI Data Integration (OCI DI) Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 31 • No additional cost for OAC • End-to-end process (Fusion -> Object Storage -> OAC -> ADW) • User friendly (vs. most data integration tools) • Little to no data transformation needed • Data movements happen within the Oracle network (Fusion pushes the data to Object Storage) • Supports incremental extracts • PaaS component, limited to no additional maintenance (automated patching) • Allows user to schedule tasks • Data flows through OAC instance • Not recommended for high data volumes • Target has to be an Oracle DB in the Cloud (on-prem DB on the roadmap) • Limited orchestration and error handling capabilities • Notifications only on the BICC side • Limited source systems Option A: OAC Replicator • OCI Data Integration (OCI DI) has a fully-managed serverless ETL architecture that reduces maintenance efforts and auto scales to manage unpredictable data workloads • Innovate faster by simplifying integration into data warehouses and data lakes to make data-driven decisions • Integrate more easily with a graphical, no-code user experience and preview data to see the results of a transformation • Rules-based integration enables ETL developers and data engineers to handle schema evolution • Discover and connect quickly to popular databases, data lakes and applications, which allows you to quickly prepare data sets for data science projects • Hybrid execution powered by Spark ETL or push-down E-LT • Storing and processing data on the same network reduces data latency and improves business performance • Complements other OCI data and AI services (OCI Data Catalog, OCI Data Flow, OCI Data Science) to manage data lakes • Only supports Oracle Cloud Infrastructure data management • Launched in 2020, may have limited features Option B: OCI DI
  • 32. Fusion Data Extraction Guidelines from Oracle A-Team Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 32  Explore available data extract/export options and choose the right one based on your requirements.  Understand the supported objects and volume/size limitations from pillar (HCM / CX / ERP / SCM) documentation based on your extract option.  Consider the mode of integration (batch or real-time), type of integration (synchronous or asynchronous), frequency (daily, weekly, monthly, …), data volume (# of records to be retrieved/processed), expected file size, storage/retrieval options, duration, and performance while choosing the extract option.  Document the data extract requirements and design the solution by considering usage of data, automation, and performance.  Do not use OTBI for data extracts. Note that OTBI is a reporting tool and not recommended for synchronous integrations. OTBI analyses have a limit of 25,000 records for exporting to Excel.  Avoid developing BIP reports using custom SQLs for data extracts and integration requirements. Consider performance, timeout, file size, and formatting if you are developing custom BIP reports.  BICC is the recommended option for extracting bulk data for ERP, SCM, and CX in batch out of Fusion Cloud Applications for external applications/data warehouses/reporting. BICC supports incremental extracts. For HCM, BICC is used only with Fusion Analytics Warehouse and HCM analytics. Data Extraction Options and Guidelines for Oracle Fusion Applications Suite  Consider security, compliance, data privacy, and encryption requirements while designing the data extraction/replication solution.  Leverage Oracle Cloud Infrastructure (OCI) and Oracle Integration Cloud (OIC) capabilities for orchestrating and automating the data extracts and integrations.  Avoid using REST API / SOAP services for extracting/exporting high-volume data sets from Oracle Fusion Cloud Applications. 
REST API / SOAP services are recommended only for real-time integrations.  Always refer to the latest documentation to get updates on features.  Research Cloud Customer Connect (https://community.oracle.com/customerconnect/) if you find gaps and create ideas as needed.  Log a Service Request with Oracle if any of the above extract options don't meet your needs, to get the right guidance.
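The 25,000-row OTBI export ceiling mentioned in the guidelines implies that any larger result set must be split across multiple requests. A trivial batching sketch (the limit value mirrors the guideline; the function name is ours):

```python
def batches(records, limit=25_000):
    """Split a record list into chunks no larger than `limit`, so each
    extract/export request stays within the documented 25,000-row ceiling."""
    for start in range(0, len(records), limit):
        yield records[start:start + limit]
```

The same chunking idea applies when paging any bulk API rather than pulling one oversized result.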
  • 33. Implementation Best Practices for EPM Cloud REST APIs Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 33 Use the implementation best practices listed in this topic when working with the EPM Cloud REST APIs. Best practices: • Before using the REST APIs, complete the prerequisites. • Use the correct authentication, as described in OAuth 2 and Basic Authentication for EPM Cloud REST APIs. • Understand the URL structure. • Know how to get the current REST API version. • Review the sample scenarios to get started quickly. • Be aware of REST API compatibility. • Use the Quick Reference to find all of the Oracle Enterprise Performance Management Cloud REST APIs at a glance. • Use API development tools (e.g., Postman)
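"Understand the URL structure" can be illustrated with a small helper that assembles an endpoint from its documented parts. The host name below is a placeholder, and while the `HyperionPlanning/rest/{version}` path follows the Planning REST API convention, each EPM service family uses its own context path:

```python
def epm_rest_url(base: str, api_family: str, version: str, resource: str) -> str:
    """Compose an EPM Cloud REST endpoint from base URL, API family context
    path, API version, and resource path."""
    return f"{base.rstrip('/')}/{api_family}/rest/{version}/{resource.lstrip('/')}"

# Placeholder service URL; 'v3' and 'applications' follow the Planning REST docs.
url = epm_rest_url("https://epm-example.oraclecloud.com",
                   "HyperionPlanning", "v3", "applications")
```

Centralizing URL assembly makes it easy to bump the API version in one place when a new REST version is released.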
  • 34. Best Practices Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 34 Leverage the best set of services for the specific use case – use as needed  OCI has an extensive set of services and capabilities, continuously being improved, that can address any workload; use as needed, to address the specific use case  Leverage serverless services as much as possible (less operational overhead, increased chance of success)  Leverage a just-enough approach (don't over simplify, don't over complicate)  Data storage should be decided based on use case (e.g., telemetry in Object Storage, financial data in ADW)  Lakehouse data should be organized leveraging a medallion architecture (bronze, silver, gold) or similar for governance and security  For schema-on-write data, understand what data modelling works best for the use case; traditionally a star schema is a good option as it is widely supported by OCI services  Leverage the data ingestion and processing engines that best support requirements (for timely data feeds leverage CDC and real time, for massive data processing leverage Spark scale-out processing,…)  Don't use a single use case pattern to address all use cases (beware the law of the hammer), as what works for one use case might not work for another
  • 35. Dipping our toes into Data Mesh 35 Copyright © 2022, Oracle and/or its affiliates | Confidential: Internal Exploring the art of possible
  • 36. Unravel Data Mesh Future of Decentralized Information Management RED HOT Jakub ILLNER Analytics & Lakehouse Cloud Design Specialist Leader Technology Cloud Engineering, EMEA 19 January 2023 10.03.2023 Copyright © 2023, Oracle and/or its affiliates | Confidential: Internal 36 OTube Link: https://otube.oracle.com/media/1_b82m570m
  • 37. 10.03.2023 Copyright © 2022, Oracle and/or its affiliates | Confidential: Internal 37 Data Mesh is a decentralized, organizational and architectural* approach to share and manage analytical data in complex and large-scale environments – within and across organizations. From “Data Mesh – Delivering Data-Driven Value at Scale” by Zhamak Dehghani, published in March 2022 by O’Reilly * Zhamak Dehghani used the term “Sociotechnical”, instead of “Organizational and Architectural”.
  • 38. Data Mesh Principles Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 38 The four principles from Dehghani’s book:  Domain Ownership – analytical data is owned and served by the business domains closest to it  Data as a Product – domain data is packaged, documented, and supported like a product for its consumers  Self-Serve Data Platform – a shared platform lets domain teams build and operate data products autonomously  Federated Computational Governance – interoperability and policy rules are agreed globally and automated across domains
  • 39. Characteristics of Usable Data Products Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 39 Discoverable Data must be easily discoverable, e.g., by using a company-wide data catalog with all data products and related metadata information. Addressable Data must be reachable via a long-lasting unique address, using common addressing conventions for polyglot data types. Understandable Data product semantics, structures, and technical features must be described in a form that is easily accessible and understandable. Trustworthy Data product guarantees service level objectives (SLOs) that include change frequency, timeliness, shape, granularity, quality, completeness, etc. Natively Accessible Data must be accessible over formats suitable to customer requirements (stream, file, table, API, etc.), possibly by using polyglot stores. Interoperable Data products must comply with harmonization rules that allow correlation across domains (identifiers, schema compatibility, metadata, etc.). Valuable on its Own Data product must be valuable and meaningful to business; technical objects should be hidden from users of data products. Secure Access to the data product must be protected via access control. Data must be encrypted and protected according to its classification.
  • 40. Examples of Data Products on OCI Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 40 Streaming database events as data products  OCI GoldenGate to ingest database transactions  OCI Streaming as event store  OCI GoldenGate Stream Analytics to curate and transform data streams  OCI Object Storage for archive of events (for auditing, logging, re-processing) Publishing data from Fusion Apps as data products  OCI Data Integration invokes BICC, which exports data to OCI Object Storage  OCI Data Integration transforms and aggregates BICC data into the target dimensional data model  Autonomous Data Warehouse manages data and provides query services (source applications: eBS, JDE, PSoft, …)
  • 41. Specialists Assistance How can Cloud Solution Specialists assist with Lakehouse workloads? 41 Copyright © 2022, Oracle and/or its affiliates | Confidential: Internal
  • 42. Cloud Solution Specialists – Analytics & Lakehouse Design Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 42 We are here to assist you and your Customer on their journey to OCI
 We provide assistance throughout the complete sales cycle with discrete activities
 We partner and collaborate with Product Management to create assets to be reused with Customers
 We engage and help close Oracle Data Platform & Lakehouse workloads
 We leverage years of expertise in Analytics & Lakehouse and combine it with best practices to bring value to your Customers
Check out our confluence page for more information and reusable assets.
Sales-cycle stages: 1. Workload Qualification, 2. Workload Validation, 3. Workload Confirmation, 4. Adoption, 5. Consuming.
(Diagram: per-stage activities include workload/architecture solution assistance, architecture design, healthcheck, and lift implementation design.)
  • 43. Useful URLs Copyright © 2021, Oracle and/or its affiliates | Confidential: Internal 43
Oracle Analytics Cloud (OAC)
 Oracle Analytics Cloud – Get Started: https://docs.oracle.com/en/cloud/paas/analytics-cloud/index.html
 Oracle Analytics Cloud and Server Roadmap: https://www.oracle.com/business-analytics/cloud-and-server-roadmap.html
 What's New for Oracle Analytics Cloud: https://docs.oracle.com/en/cloud/paas/analytics-cloud/acswn/index.html#ACSWN-GUID-CFF90F44-BCEB-49EE-B40B-8D040F02D476
 Create Services with Oracle Analytics Cloud: https://docs.oracle.com/en/cloud/paas/analytics-cloud/acoci/create-services.html#GUID-47022452-65CC-4345-8F7F-A447BB24A48A
 SaaS Data Replication in Oracle Analytics Cloud (OAC): https://www.ateam-oracle.com/saas-data-replication-in-oracle-analytics-cloud-oac-and-oaac
Fusion Cloud
 Creating a Business Intelligence Cloud Extract: https://docs.oracle.com/en/cloud/saas/applications-common/21b/biacc/get-started.html#get-started
 OCI Data Integration (OCI DI) Help Center: https://docs.oracle.com/en-us/iaas/data-integration/using/index.htm
 Load and transform data from Oracle Fusion Cloud Applications to build a data lake or data warehouse: https://blogs.oracle.com/cloud-infrastructure/post/load-and-transform-data-from-oracle-fusion-cloud-applications-to-build-a-data-lake-or-data-warehouse
Enterprise Performance Management (EPM)
 EPM Automate – Copying a Snapshot to or from Oracle Object Storage: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/cepma/sample_script_15_object_store.html
 REST API for Oracle EPM: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/prest/index.html
 Working with EPM Automate for Oracle EPM: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/cepma/index.html
  • 45. Our mission is to help people see data in new ways, discover insights, and unlock endless possibilities.

Editor's notes

  1. In this session, we are going to explore some of the integration options available to create visualizations and a Lakehouse for Oracle applications. We will start by discussing the modern data platform on OCI, the Lakehouse architecture, and the related OCI services that support it. We will then discuss the data extraction methods available on OCI for Fusion and EPM. We will end with a few best practices and possible use cases. In the interest of time, we will mainly focus on integration patterns that are recommended for Fusion and EPM, but don't hesitate to reach out if you would like to talk to us about other Oracle applications. Enjoy!
  2. https://www.oracle.com/business-analytics/analytics-platform/ The Oracle Analytics platform is a cloud-native service that provides the capabilities required to address the entire analytics process, including data ingestion and modeling, data preparation and enrichment, and visualization.
  3. https://www.oracle.com/autonomous-database/ With machine-learning-driven automated tuning, scaling, and patching, Autonomous Database delivers the highest performance, availability, and security for OLTP, analytics, batch, and Internet of Things (IoT) workloads. Autonomous Database's converged engine supports diverse data types, simplifying application development and deployment from modeling and coding to ETL, database optimization, and data analysis.
Analytic Views (AV) – primary use cases:
- Visualization agnostic: since an AV codifies the definition of the business model and calculations, it makes it easy for BI users to use their preferred visualization tool (e.g., APEX, OAC*, Power BI, Tableau, …).
- Enhance data sets for OAC: OAC consumes data from Analytic Views via the RPD. AVs facilitate augmented analytics by blending data from disparate sources, while also allowing the option to use additional connectors available in OAC to further augment the data.
- Use hierarchical calculations (time series, shares, rankings, etc.).
- Application development using APEX.
- Simplified SQL generation: no need to express aggregation rules, joins, or calculation expressions in queries; just select columns and filter rows.
AV features/benefits:
- Defines a dimensional model using hierarchies, levels, attributes, and measures
- Provides presentation metadata (labels, descriptions, and other properties)
- Hierarchy views and analytic views can be queried with SQL, MDX, and REST
- Supports both dimensional/hierarchical and relational-style queries
- A smart query transformation engine generates optimized execution SQL from queries selecting from hierarchy views and analytic views
- Simple, automatic aggregate management
- Multi-lingual support: business models may be presented in every language supported by the Oracle Database
Queries over Object Store:
- Run SQL queries against object storage (OCI Object Storage, AWS S3, Azure Blob Storage, Google Cloud Platform)
- CSV, Parquet, ORC, JSON, and Avro formats
- Combine database and object storage data in SQL
- Optimized query performance: dynamic auto-scaling, data lake smart scans*, data pruning with partitioned-table support, Parquet-intelligent reads
- Automatic and transparent: engages only when necessary, uses auto-scaling to augment database compute for the life of the query, and object store processing is isolated from database cores
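The hierarchical calculations an Analytic View codifies (shares, rankings, time series) can be illustrated in plain Python. The sample data and two-level hierarchy below are invented; the point is that with an AV, a query simply names a "share of total" measure instead of spelling out this arithmetic.

```python
# Illustrative only: computes the "share of parent" and ranking measures
# that an Analytic View would expose declaratively. Data is invented.
sales = {"East": 40.0, "West": 60.0}   # leaf level: region
total = sum(sales.values())            # parent level: all regions

share_of_total = {region: amount / total for region, amount in sales.items()}
ranking = sorted(sales, key=sales.get, reverse=True)

print(share_of_total)  # {'East': 0.4, 'West': 0.6}
print(ranking)         # ['West', 'East']
```

In SQL against an AV, the equivalent would be selecting a predefined calculated measure and filtering rows, with the rollup logic defined once in the view rather than repeated in every query.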
  4. Oracle Data Integrator (ODI) provides data migration with its innovative extract, load, and transform (E-LT) technology, optimized for most on-premises and cloud databases. ODI is one of Oracle's oldest integration tools; it was added to the integration stack through the acquisition of Sunopsis in 2007. Back then, Sunopsis used the database as the processing engine but had heterogeneous connectivity, which was the inception of the E-LT concept. E-LT allows users to perform transformations at either the source or the target, and provides a flexible architecture for optimized performance on any platform. Benefits: prebuilt connectors for many databases and technologies; a pluggable Knowledge Module architecture; available on-premises and on the OCI Marketplace. History: ODI is the successor to Oracle's older data integration software, Oracle Warehouse Builder (OWB). OWB was designed to take advantage of the Oracle Database for executing data integration flows; however, it was not heterogeneous. Oracle acquired Sunopsis in 2007, which also used the database as the processing engine but had heterogeneous connectivity. This became ODI, which has gone through many versions, maturing in this space. Its very flexible architecture allowed it to adapt to a changing marketplace: when Big Data came along, the capability was extended to generate code for Big Data environments, and now the product is being adapted to the cloud.
  5. Innovative optimizer for Spark ETL and pushdown E-LT: the service automatically chooses the optimal processing approach between Spark ETL (extract-transform-load) and pushdown E-LT (extract-load-transform) when integrating data for data lakes and analytical systems. The service evaluates multiple transformation plans and optimizes for Spark or pushdown processing. Pushdown E-LT helps eliminate performance degradation on data sources.
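The idea behind the optimizer can be sketched as a simple rule: push the transformation down into the target engine when it can run there, otherwise fall back to Spark. The rule and names below are invented for illustration and are not the actual OCI Data Integration cost model.

```python
# Toy illustration of choosing between pushdown E-LT and Spark ETL.
# The decision rule is a hypothetical simplification, not the real optimizer.
def choose_strategy(target_supports_sql_pushdown: bool,
                    all_sources_reachable_from_target: bool) -> str:
    if target_supports_sql_pushdown and all_sources_reachable_from_target:
        return "pushdown E-LT"   # transform inside the target database engine
    return "Spark ETL"           # transform in the service's Spark runtime

print(choose_strategy(True, True))   # pushdown E-LT
print(choose_strategy(True, False))  # Spark ETL
```

The real service evaluates multiple transformation plans per mapping, but the trade-off it arbitrates is the same: engine-side SQL pushdown versus an external Spark transformation tier.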
  6. ODI Marketplace and OCI DI can extract data to Object Storage or UCM. Note: BICC extracts sent to Object Storage flow directly to the target database, as opposed to UCM, which would require the files to be downloaded to the ODI VM.