Designing Big Data Pipelines
Applying the TOREADOR Methodology
BDVA webinar
Claudio Ardagna, Paolo Ceravolo, Ernesto Damiani
Methodology again
Declarative Model Specification → Service Selection → Procedural Model Definition → Workflow Compiler → Deployment Model → Execution
Supporting components: Declarative Specifications, Service Catalog, Service Composition Repository, Deployment Configurations
Toreador Platform → Big Data Platform
Toreador code-based line / Toreador recipes
DS → SS → SC → WC → E
(Declarative Specification, Service Selection, Service Composition, Workflow Compilation, Execution)
Sample Scenario
• Infrastructure for pollution monitoring managed by Lombardia Informatica, an agency of the Lombardy region in Italy.
• A network of sensors acquires pollution data every day:
  • sensors, containing information about a specific acquiring sensor, such as ID, pollutant type, and unit of measure
  • data acquisition stations, managing a set of sensors and information about their position (e.g., longitude/latitude)
  • pollution values, containing the values acquired by the sensors, the timestamp, and the validation status. Each value is validated by a human operator who manually labels it as valid or invalid.
• The goal is to design and deploy a Big Data pipeline to:
  • predict the labels of acquired data in real time
  • alert the operator when anomalous values are observed
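To make the scenario concrete, here is a minimal sketch (our illustration, not part of the TOREADOR material) of the record layout and of the two goals; the field names and the threshold rule standing in for the trained model are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One pollution value as described above (field names are assumed)."""
    sensor_id: int
    pollutant: str   # pollutant type, e.g. "PM10"
    value: float     # in the sensor's unit of measure
    timestamp: str

def predict_label(reading: Reading, history_mean: float, tolerance: float = 3.0) -> str:
    """Stand-in for the trained model: label a reading 'invalid' when it
    deviates too far from the historical mean for its sensor."""
    return "invalid" if abs(reading.value - history_mean) > tolerance else "valid"

def needs_alert(label: str) -> bool:
    # The operator is alerted only when an anomalous (invalid) value is observed.
    return label == "invalid"
```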
Reference Scenario
Key Advances
• Batch and stream support: guide the user in selecting a consistent set of services for both batch and stream computations
• Platform independence: use a smart compiler to generate executable computations for different platforms
• End-to-end verifiability: include an end-to-end procedure for checking the consistency of model specifications
• Model reuse and refinement: store declarative, procedural, and deployment models as templates to replicate or extend designs
Pipeline: Sensor Data → Queue (Kafka) → Compute Predictive Label (Spark) → Store (HBase) → Display/Query
Without the methodology…
• Draft the pipeline stages
• Identify the technology
• Develop the scripts
• Deploy
Slow, error-prone, difficult to reuse…
• The pipeline includes two processing stages: a training stage and a prediction stage
• Our Declarative Model will therefore include two requirement specifications, DS1 and DS2:
DS1 (batch training):
DataPreparation.DataTransformation.Filtering;
DataAnalitycs.LearningApproach.Supervised;
DataAnalitycs.LearningStep.Training;
DataAnalitycs.AnalyticsAim.Regression;
DataProcessing.AnalyticsGoal.Batch.
DS2 (streaming prediction):
DataAnalitycs.LearningApproach.Supervised;
DataAnalitycs.LearningStep.Prediction;
DataAnalitycs.AnalyticsAim.Regression;
DataProcessing.AnalyticsGoal.Streaming.
Declarative Model
DS1
DS2
• Based on the Declarative Models, the TOREADOR Service Selection (SS) will return a set of services consistent with DS1 and DS2
• The user can easily compose these services to address the scenario's goals
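For illustration, the two declarative specifications can be held as plain data; the dotted strings are the ones listed above (with their original spelling), while the surrounding structure and the area() helper are our assumptions:

```python
# DS1: batch training stage; DS2: streaming prediction stage.
DS1 = [
    "DataPreparation.DataTransformation.Filtering",
    "DataAnalitycs.LearningApproach.Supervised",
    "DataAnalitycs.LearningStep.Training",
    "DataAnalitycs.AnalyticsAim.Regression",
    "DataProcessing.AnalyticsGoal.Batch",
]
DS2 = [
    "DataAnalitycs.LearningApproach.Supervised",
    "DataAnalitycs.LearningStep.Prediction",
    "DataAnalitycs.AnalyticsAim.Regression",
    "DataProcessing.AnalyticsGoal.Streaming",
]

def area(spec: str) -> str:
    """The first dotted segment names the vocabulary area of a specification."""
    return spec.split(".")[0]
```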
Procedural Model
DS1
SS
SC1
DS2
SS
SC2
• The two compositions must be connected, as the egestion of SC1 is the ingestion for SC2
Procedural Model
DS1
SS
SC1
DS2
SS
SC2
ā€¢ The TOREADOR compiler translates SC1 and SC2 into executable
orchestrations in a suitable workflow language
Deployment Model
DS1
SS
SC1
DS2
SS
SC2
spark-filter-sensorsTest : filter
  --expr="sensorsDF#SensorId === 5958"
  --inputPath="/user/root/sensors/joined.csv"
  --outputPath="/user/root/sensors_test.csv" &&
spark-assemblerTest : spark-assembler
  --features="Data,Quote"
  --inputPath="/user/root/sensors_test.csv"
  --outputPath="/user/root/sensors/sensors_test_assembled.csv" &&
spark-gbt-predict : batch-gradientboostedtree-classification-predict
  --inputPath=/user/root/sensors/sensors
  --outputPath=/user/root/sensors/sensors
  --model=/user/root/sensors/model
WC1
WC2
1-n
Deployment
ā€¢ The execution of WC2 produces the results
Deployment Model
DS1
SS
SC2
WC2
E2
The Code-based Line
Code Once/Deploy Everywhere
The Toreador Code-based line user is an expert programmer, aware of the potential (flexibility and controllability) and purposes (analytics developed from scratch, or migration of legacy code) of a code-based approach.
She expresses the parallel computation of a coded algorithm in terms of parallel primitives.
Toreador distributes it among computational nodes hosted by different Cloud environments.
The resulting computation can be saved as a service for the Service-based line.
I. Code → II. Transform → III. Deploy
Skeleton-Based
Code Compiler
Code-based compiler
import math
import random

def data_parallel_region(distr, func, *repl):
    return [func(x, *repl) for x in distr]

def distance(a, b):
    """Computes the Euclidean distance between two vectors"""
    return math.sqrt(sum([(x[1] - x[0]) ** 2 for x in zip(a, b)]))

def kmeans_init(data, k):
    """Returns the initial centroids configuration"""
    return random.sample(data, k)

def kmeans_assign(p, centroids):
    """Returns the given instance paired to the key of its nearest centroid"""
    comparator = lambda x: distance(x[1], p)
    return min(enumerate(centroids), key=comparator)[0], p
Source Code
MapReduce
Bag of Tasks
Producer Consumer
ā€¦......
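As an illustrative usage of the parallel primitive above (not taken from the slides), a single k-means assignment step can be expressed through data_parallel_region; the sample data and the sequential stand-in for distribution are assumptions:

```python
import math

def data_parallel_region(distr, func, *repl):
    # Sequential stand-in for the parallel primitive: in TOREADOR this is
    # what gets distributed across nodes according to the chosen skeleton.
    return [func(x, *repl) for x in distr]

def distance(a, b):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans_assign(p, centroids):
    """Pair the instance with the index of its nearest centroid."""
    key = min(range(len(centroids)), key=lambda i: distance(centroids[i], p))
    return key, p

# Hypothetical sample data for one assignment step.
data = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.1, 4.9)]
centroids = [(0.0, 0.0), (5.0, 5.0)]
assignments = data_parallel_region(data, kmeans_assign, centroids)
```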
(Figure: the skeleton-based code compiler replicates the source code into a skeleton and secondary scripts, one variant per target parallel pattern.)
Or reach us at info@toreador-project.eu
2017
Want to give it a try? Stay Tuned!
http://www.toreador-project.eu/community/
Thank you
Declarative Model Definition
Declarative Models: vocabulary
• The declarative model offers a vocabulary for a computation-independent description of a BDA
• Organized in 5 areas:
  • Representation (Data Model, Data Type, Management, Partitioning)
  • Preparation (Data Reduction, Expansion, Cleaning, Anonymization)
  • Analytics (Analytics Model, Task, Learning Approach, Expected Quality)
  • Processing (Analysis Goal, Interaction, Performances)
  • Visualization and Reporting (Goal, Interaction, Data Dimensionality)
• Each specification can be structured in three levels:
  • Goal: Indicator – Objective – Constraint
  • Feature: Type – Sub-Type – Sub-Sub-Type
Declarative Models
• A web-based GUI for specifying the requirements of a BDA
• No coding, for basic users
• Analytics services are provided by the target TOREADOR platform
• Big Data campaign built by composing existing services
• Based on model transformations
Declarative Models
• A web-based GUI for specifying the requirements of a BDA
• Data_Preparation.Data_Source_Model.Data_Model.Document_Oriented
• Data_Analytics.Analytics_Aim.Task.Crisp_Clustering
Declarative Models: machine readable
• A web-based GUI for specifying the requirements of a BDA
• Data_Preparation.Data_Source_Model.Data_Model.Document_Oriented
• Data_Analytics.Analytics_Aim.Task.Crisp_Clustering
…
"tdm:label": "Data Representation",
"tdm:incorporates": [
  {
    "@type": "tdm:Feature",
    "tdm:label": "Data Source Model Type",
    "tdm:constraint": "{}",
    "tdm:incorporates": [
      {
        "@type": "tdm:Feature",
        "tdm:label": "Data Structure",
        "tdm:constraint": "{}",
        "tdm:visualisationType": "Option",
        "tdm:incorporates": [
          {
            "@type": "tdm:Feature",
            "tdm:constraint": "{}",
            "tdm:label": "Structured",
            "$$hashKey": "object:21"
          }
        ]
      },
      ....
Interference Declaration
• A few examples:
Data_Preparation.Anonymization.Technique.k-anonymity
  → ¬ Data_Analitycs.Analitycs_Quality.False_Positive_Rate.low
Data_Preparation.Anonymization.Technique.hashing
  → ¬ Data_Analitycs.Analitycs_Aim.Task.Crisp_Clustering.algorithm=k-mean
Data_Representation.Storage_Property.Coherence_Model.Strong_Consistency
  → ¬ Data_Representation.Storage_Property.Partitioning
• Interference Declarations
  • Boolean Interference: P → ¬Q
  • Intensity of an Interference: DP ∩ DQ
• Interference Enforcement
  • Fuzzy interpretation: max(1−P, 1−Q)
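The two interpretations above can be sketched in a few lines; the feature strings below are the examples from the previous slide, while the function names and the membership-degree inputs are our assumptions:

```python
def boolean_interference(selected, p, q):
    """Boolean interference P -> not Q: holds unless both P and Q are selected."""
    return not (p in selected and q in selected)

def fuzzy_interference(p_degree, q_degree):
    """Fuzzy interpretation max(1-P, 1-Q): degree to which P -> not Q holds,
    given membership degrees of P and Q in [0, 1]."""
    return max(1.0 - p_degree, 1.0 - q_degree)

# Example: selecting k-anonymity interferes with a low false-positive rate.
selected = {
    "Data_Preparation.Anonymization.Technique.k-anonymity",
    "Data_Analitycs.Analitycs_Quality.False_Positive_Rate.low",
}
violated = not boolean_interference(
    selected,
    "Data_Preparation.Anonymization.Technique.k-anonymity",
    "Data_Analitycs.Analitycs_Quality.False_Positive_Rate.low",
)
```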
Consistency Check
Service-Based Line
Methodology: Building Blocks
• Declarative Specifications allow customers to define declarative models shaping a BDA and retrieve a set of compatible services
• Service Catalog specifies the set of abstract services (e.g., algorithms, mechanisms, or components) that are available to Big Data customers and consultants for building their BDA
• Service Composition Repository allows specifying the procedural model that defines how services can be composed to carry out the Big Data analytics, and supports the specification of an abstract Big Data service composition
• Deployment Configurations define the platform-dependent version of a procedural model, as a workflow that is ready to be executed on the target Big Data platform
Overview of the Methodology
Declarative Model Specification → Service Selection → Procedural Model Definition → Workflow Compiler → Deployment Model → Execution
Supporting components: Declarative Specifications, Service Catalog, Service Composition Repository, Deployment Configurations
MBDAaaS Platform → Big Data Platform
Procedural Models
• Platform-independent models that formally and unambiguously describe how analytics should be configured and executed
• They are generated following the goals and constraints specified in the declarative models
• They provide a workflow in the form of a service orchestration:
  • Sequence
  • Choice
  • If-then
  • Do-While
  • Split-Join
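As an in-memory illustration (not the actual OWL-S encoding), the five constructs can be represented as nested tuples; the pipeline below is a hypothetical composition:

```python
# Constructors for the five orchestration constructs; purely illustrative.
def sequence(*steps):      return ("sequence", list(steps))
def choice(*branches):     return ("choice", list(branches))
def if_then(cond, then):   return ("if-then", cond, then)
def do_while(cond, body):  return ("do-while", cond, body)
def split_join(*parallel): return ("split-join", list(parallel))

# A hypothetical composition: filter, then two services in parallel, then predict.
pipeline = sequence("filter", split_join("assemble", "validate"), "predict")
```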
Service Composition
• User creates the flow based on the list of returned services
• Services are enriched with ad hoc parameters
• The flow is submitted to the service, which translates it into an OWL-S service composition
• All internals are made explicit
• Clear specification of the services
• Reuse and modularity
Deployment Model Definition
Workflow Compiler
• It consists of two main sub-processes:
  • Structure generation: the compiler parses the procedural model and identifies the process operators (sequence, alternative, parallel, loop) composing it
  • Service configuration: for each service in the procedural model, the corresponding executable service is identified and inserted in the deployment model
• Supports transformations to any orchestration engine available as a service
• Available for Oozie and Spring XD
Deployment Model
• The workflow compiler takes as input:
  • the OWL-S service composition
  • information on the target platform (e.g., installed services/algorithms)
• It produces as output an executable workflow, for example an Oozie workflow:
  • the XML file of the workflow
  • job.properties
  • system variables
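As a rough sketch of what such compiler output looks like, the fragment below emits only the control skeleton of an Oozie-style workflow for a sequence of services; it is our simplification, not the TOREADOR compiler, and a real Oozie workflow needs a full action body (e.g. a spark or shell element) inside each action:

```python
def compile_workflow(name, services):
    """Emit a minimal Oozie-style control skeleton chaining services in sequence."""
    first = services[0] if services else "end"
    lines = [
        f'<workflow-app name="{name}" xmlns="uri:oozie:workflow:0.5">',
        f'  <start to="{first}"/>',
    ]
    for svc, nxt in zip(services, services[1:] + ["end"]):
        lines += [
            f'  <action name="{svc}">',
            # A real action body (<spark>, <shell>, ...) would go here.
            f'    <ok to="{nxt}"/>',
            '    <error to="kill"/>',
            '  </action>',
        ]
    lines += [
        '  <kill name="kill"><message>workflow failed</message></kill>',
        '  <end name="end"/>',
        '</workflow-app>',
    ]
    return "\n".join(lines)
```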
Translating the Composition Structure
• Deployment models:
  • specify how procedural models are instantiated and configured on a target platform
  • drive analytics execution in real scenarios
  • are platform-dependent
• The workflow compiler transforms the procedural model into a deployment model that can be directly executed on the target platform
• This transformation is based on a compiler that takes as input:
  • the OWL-S service composition
  • information on the target platform (e.g., installed services/algorithms)
  and produces as output a technology-dependent workflow
Translating the Composition Structure
• The OWL-S service composition structure is mapped onto different control constructs
• Workflows contain three distinct types of placeholders:
  • GREEN placeholders are SYSTEM variables defined in the Oozie properties
  • RED placeholders are JOB variables defined in the job.properties file
  • YELLOW placeholders are ARGUMENTS of executable jobs on the Oozie server
• More in the demo…
Generating an Executable Workflow
Analytics Deployment Approach
Chintamani Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore ...amitlee9823
Ā 
100-Concepts-of-AI by Anupama Kate .pptx
100-Concepts-of-AI by Anupama Kate .pptx100-Concepts-of-AI by Anupama Kate .pptx
100-Concepts-of-AI by Anupama Kate .pptxAnupama Kate
Ā 
Invezz.com - Grow your wealth with trading signals
Invezz.com - Grow your wealth with trading signalsInvezz.com - Grow your wealth with trading signals
Invezz.com - Grow your wealth with trading signalsInvezz1
Ā 
April 2024 - Crypto Market Report's Analysis
April 2024 - Crypto Market Report's AnalysisApril 2024 - Crypto Market Report's Analysis
April 2024 - Crypto Market Report's Analysismanisha194592
Ā 
Junnasandra Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore...
Junnasandra Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore...Junnasandra Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore...
Junnasandra Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore...amitlee9823
Ā 
VidaXL dropshipping via API with DroFx.pptx
VidaXL dropshipping via API with DroFx.pptxVidaXL dropshipping via API with DroFx.pptx
VidaXL dropshipping via API with DroFx.pptxolyaivanovalion
Ā 
CebaBaby dropshipping via API with DroFX.pptx
CebaBaby dropshipping via API with DroFX.pptxCebaBaby dropshipping via API with DroFX.pptx
CebaBaby dropshipping via API with DroFX.pptxolyaivanovalion
Ā 
Mature dropshipping via API with DroFx.pptx
Mature dropshipping via API with DroFx.pptxMature dropshipping via API with DroFx.pptx
Mature dropshipping via API with DroFx.pptxolyaivanovalion
Ā 
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptxBPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptxMohammedJunaid861692
Ā 
BDSMāš”Call Girls in Mandawali Delhi >ą¼’8448380779 Escort Service
BDSMāš”Call Girls in Mandawali Delhi >ą¼’8448380779 Escort ServiceBDSMāš”Call Girls in Mandawali Delhi >ą¼’8448380779 Escort Service
BDSMāš”Call Girls in Mandawali Delhi >ą¼’8448380779 Escort ServiceDelhi Call girls
Ā 
BigBuy dropshipping via API with DroFx.pptx
BigBuy dropshipping via API with DroFx.pptxBigBuy dropshipping via API with DroFx.pptx
BigBuy dropshipping via API with DroFx.pptxolyaivanovalion
Ā 
Vip Model Call Girls (Delhi) Karol Bagh 9711199171āœ”ļøBody to body massage wit...
Vip Model  Call Girls (Delhi) Karol Bagh 9711199171āœ”ļøBody to body massage wit...Vip Model  Call Girls (Delhi) Karol Bagh 9711199171āœ”ļøBody to body massage wit...
Vip Model Call Girls (Delhi) Karol Bagh 9711199171āœ”ļøBody to body massage wit...shivangimorya083
Ā 
Delhi Call Girls Punjabi Bagh 9711199171 ā˜Žāœ”šŸ‘Œāœ” Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Punjabi Bagh 9711199171 ā˜Žāœ”šŸ‘Œāœ” Whatsapp Hard And Sexy Vip CallDelhi Call Girls Punjabi Bagh 9711199171 ā˜Žāœ”šŸ‘Œāœ” Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Punjabi Bagh 9711199171 ā˜Žāœ”šŸ‘Œāœ” Whatsapp Hard And Sexy Vip Callshivangimorya083
Ā 
Best VIP Call Girls Noida Sector 22 Call Me: 8448380779
Best VIP Call Girls Noida Sector 22 Call Me: 8448380779Best VIP Call Girls Noida Sector 22 Call Me: 8448380779
Best VIP Call Girls Noida Sector 22 Call Me: 8448380779Delhi Call girls
Ā 
Call Girls Hsr Layout Just Call šŸ‘— 7737669865 šŸ‘— Top Class Call Girl Service Ba...
Call Girls Hsr Layout Just Call šŸ‘— 7737669865 šŸ‘— Top Class Call Girl Service Ba...Call Girls Hsr Layout Just Call šŸ‘— 7737669865 šŸ‘— Top Class Call Girl Service Ba...
Call Girls Hsr Layout Just Call šŸ‘— 7737669865 šŸ‘— Top Class Call Girl Service Ba...amitlee9823
Ā 
Ravak dropshipping via API with DroFx.pptx
Ravak dropshipping via API with DroFx.pptxRavak dropshipping via API with DroFx.pptx
Ravak dropshipping via API with DroFx.pptxolyaivanovalion
Ā 
Call me @ 9892124323 Cheap Rate Call Girls in Vashi with Real Photo 100% Secure
Call me @ 9892124323  Cheap Rate Call Girls in Vashi with Real Photo 100% SecureCall me @ 9892124323  Cheap Rate Call Girls in Vashi with Real Photo 100% Secure
Call me @ 9892124323 Cheap Rate Call Girls in Vashi with Real Photo 100% SecurePooja Nehwal
Ā 
Determinants of health, dimensions of health, positive health and spectrum of...
Determinants of health, dimensions of health, positive health and spectrum of...Determinants of health, dimensions of health, positive health and spectrum of...
Determinants of health, dimensions of health, positive health and spectrum of...shambhavirathore45
Ā 
Midocean dropshipping via API with DroFx
Midocean dropshipping via API with DroFxMidocean dropshipping via API with DroFx
Midocean dropshipping via API with DroFxolyaivanovalion
Ā 

Recently uploaded (20)

Chintamani Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore ...
Chintamani Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore ...Chintamani Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore ...
Chintamani Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore ...
Ā 
100-Concepts-of-AI by Anupama Kate .pptx
100-Concepts-of-AI by Anupama Kate .pptx100-Concepts-of-AI by Anupama Kate .pptx
100-Concepts-of-AI by Anupama Kate .pptx
Ā 
Invezz.com - Grow your wealth with trading signals
Invezz.com - Grow your wealth with trading signalsInvezz.com - Grow your wealth with trading signals
Invezz.com - Grow your wealth with trading signals
Ā 
April 2024 - Crypto Market Report's Analysis
April 2024 - Crypto Market Report's AnalysisApril 2024 - Crypto Market Report's Analysis
April 2024 - Crypto Market Report's Analysis
Ā 
Junnasandra Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore...
Junnasandra Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore...Junnasandra Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore...
Junnasandra Call Girls: šŸ“ 7737669865 šŸ“ High Profile Model Escorts | Bangalore...
Ā 
VidaXL dropshipping via API with DroFx.pptx
VidaXL dropshipping via API with DroFx.pptxVidaXL dropshipping via API with DroFx.pptx
VidaXL dropshipping via API with DroFx.pptx
Ā 
CebaBaby dropshipping via API with DroFX.pptx
CebaBaby dropshipping via API with DroFX.pptxCebaBaby dropshipping via API with DroFX.pptx
CebaBaby dropshipping via API with DroFX.pptx
Ā 
Mature dropshipping via API with DroFx.pptx
Mature dropshipping via API with DroFx.pptxMature dropshipping via API with DroFx.pptx
Mature dropshipping via API with DroFx.pptx
Ā 
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptxBPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
Ā 
BDSMāš”Call Girls in Mandawali Delhi >ą¼’8448380779 Escort Service
BDSMāš”Call Girls in Mandawali Delhi >ą¼’8448380779 Escort ServiceBDSMāš”Call Girls in Mandawali Delhi >ą¼’8448380779 Escort Service
BDSMāš”Call Girls in Mandawali Delhi >ą¼’8448380779 Escort Service
Ā 
BigBuy dropshipping via API with DroFx.pptx
BigBuy dropshipping via API with DroFx.pptxBigBuy dropshipping via API with DroFx.pptx
BigBuy dropshipping via API with DroFx.pptx
Ā 
(NEHA) Call Girls Katra Call Now 8617697112 Katra Escorts 24x7
(NEHA) Call Girls Katra Call Now 8617697112 Katra Escorts 24x7(NEHA) Call Girls Katra Call Now 8617697112 Katra Escorts 24x7
(NEHA) Call Girls Katra Call Now 8617697112 Katra Escorts 24x7
Ā 
Vip Model Call Girls (Delhi) Karol Bagh 9711199171āœ”ļøBody to body massage wit...
Vip Model  Call Girls (Delhi) Karol Bagh 9711199171āœ”ļøBody to body massage wit...Vip Model  Call Girls (Delhi) Karol Bagh 9711199171āœ”ļøBody to body massage wit...
Vip Model Call Girls (Delhi) Karol Bagh 9711199171āœ”ļøBody to body massage wit...
Ā 
Delhi Call Girls Punjabi Bagh 9711199171 ā˜Žāœ”šŸ‘Œāœ” Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Punjabi Bagh 9711199171 ā˜Žāœ”šŸ‘Œāœ” Whatsapp Hard And Sexy Vip CallDelhi Call Girls Punjabi Bagh 9711199171 ā˜Žāœ”šŸ‘Œāœ” Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Punjabi Bagh 9711199171 ā˜Žāœ”šŸ‘Œāœ” Whatsapp Hard And Sexy Vip Call
Ā 
Best VIP Call Girls Noida Sector 22 Call Me: 8448380779
Best VIP Call Girls Noida Sector 22 Call Me: 8448380779Best VIP Call Girls Noida Sector 22 Call Me: 8448380779
Best VIP Call Girls Noida Sector 22 Call Me: 8448380779
Ā 
Call Girls Hsr Layout Just Call šŸ‘— 7737669865 šŸ‘— Top Class Call Girl Service Ba...
Call Girls Hsr Layout Just Call šŸ‘— 7737669865 šŸ‘— Top Class Call Girl Service Ba...Call Girls Hsr Layout Just Call šŸ‘— 7737669865 šŸ‘— Top Class Call Girl Service Ba...
Call Girls Hsr Layout Just Call šŸ‘— 7737669865 šŸ‘— Top Class Call Girl Service Ba...
Ā 
Ravak dropshipping via API with DroFx.pptx
Ravak dropshipping via API with DroFx.pptxRavak dropshipping via API with DroFx.pptx
Ravak dropshipping via API with DroFx.pptx
Ā 
Call me @ 9892124323 Cheap Rate Call Girls in Vashi with Real Photo 100% Secure
Call me @ 9892124323  Cheap Rate Call Girls in Vashi with Real Photo 100% SecureCall me @ 9892124323  Cheap Rate Call Girls in Vashi with Real Photo 100% Secure
Call me @ 9892124323 Cheap Rate Call Girls in Vashi with Real Photo 100% Secure
Ā 
Determinants of health, dimensions of health, positive health and spectrum of...
Determinants of health, dimensions of health, positive health and spectrum of...Determinants of health, dimensions of health, positive health and spectrum of...
Determinants of health, dimensions of health, positive health and spectrum of...
Ā 
Midocean dropshipping via API with DroFx
Midocean dropshipping via API with DroFxMidocean dropshipping via API with DroFx
Midocean dropshipping via API with DroFx
Ā 

BDVe Webinar Series - Toreador Intro - Designing Big Data pipelines (Paolo Ceravolo)

  • 1. Designing Big Data Pipelines Applying the TOREADOR Methodology BDVA webinar Claudio Ardagna, Paolo Ceravolo, Ernesto Damiani
  • 4. Sample Scenario • Infrastructure for pollution monitoring managed by Lombardia Informatica, an agency of the Lombardy region in Italy. • A network of sensors acquires pollution data every day: • sensors, containing information about a specific acquiring sensor, such as ID, pollutant type, unit of measure • data acquisition stations, managing a set of sensors and information regarding their position (e.g. longitude/latitude) • pollution values, containing the values acquired by sensors, the timestamp, and the validation status. Each value is validated by a human operator who manually labels it as valid or invalid.
  • 5. Reference Scenario • The goal is to design and deploy a Big Data pipeline to: • predict the labels of acquired data in real time • alert the operator when anomalous values are observed
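As a minimal illustration of these two goals, the intended behaviour can be sketched as follows; this is a stand-in with a simple threshold rule in place of the trained classifier, and the function names and bounds are made up for the sketch:

```python
# Illustrative sketch: label incoming readings and alert the operator on
# anomalies. The threshold rule is a stand-in; the real pipeline predicts
# labels with a model trained on operator-validated historical data.

def predict_label(value, lower=0.0, upper=400.0):
    """Label a pollutant reading as the human validators would (toy rule)."""
    return "valid" if lower <= value <= upper else "invalid"

def process(readings):
    """Return (labelled readings, alert messages for the operator)."""
    labelled, alerts = [], []
    for sensor_id, value in readings:
        label = predict_label(value)
        labelled.append((sensor_id, value, label))
        if label == "invalid":
            alerts.append(f"sensor {sensor_id}: anomalous value {value}")
    return labelled, alerts

labelled, alerts = process([(5958, 42.0), (5958, -1.0)])
print(alerts)  # ['sensor 5958: anomalous value -1.0']
```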
  • 6. Key Advances • Batch and stream support: guide the user in selecting a consistent set of services for both batch and stream computations • Platform independence: use a smart compiler for generating executable computations for different platforms • End-to-end verifiability: include an end-to-end procedure for checking the consistency of model specifications • Model reuse and refinement: store declarative, procedural and deployment models as templates to replicate or extend designs
  • 7. Without the methodology… • Draft the pipeline stages • Identify the technology • Develop the scripts • Deploy. Slow, error-prone, difficult to reuse… [Pipeline diagram: Sensor Data → Queue (Kafka) → Compute predictive label (Spark) → Store (HBase) → Display/Query]
  • 8. Declarative Model • The pipeline includes two processing stages: a training stage and a prediction stage • Our declarative model will include 2 requirement specifications: DS1: DataPreparation.DataTransformation.Filtering; DataAnalytics.LearningApproach.Supervised; DataAnalytics.LearningStep.Training; DataAnalytics.AnalyticsAim.Regression; DataProcessing.AnalyticsGoal.Batch. DS2: DataAnalytics.LearningApproach.Supervised; DataAnalytics.LearningStep.Prediction; DataAnalytics.AnalyticsAim.Regression; DataProcessing.AnalyticsGoal.Streaming.
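As a minimal illustration, the two specifications can be held as plain data and queried by area; the helper below is hypothetical, not part of the TOREADOR API:

```python
# Hypothetical sketch: the two declarative specifications (DS1, DS2) as plain
# data. Feature paths follow the Area.Feature.SubType naming from the slide;
# the by_area helper is illustrative only.

DS1 = [
    "DataPreparation.DataTransformation.Filtering",
    "DataAnalytics.LearningApproach.Supervised",
    "DataAnalytics.LearningStep.Training",
    "DataAnalytics.AnalyticsAim.Regression",
    "DataProcessing.AnalyticsGoal.Batch",
]
DS2 = [
    "DataAnalytics.LearningApproach.Supervised",
    "DataAnalytics.LearningStep.Prediction",
    "DataAnalytics.AnalyticsAim.Regression",
    "DataProcessing.AnalyticsGoal.Streaming",
]

def by_area(spec):
    """Group feature paths by their top-level area (first dotted segment)."""
    areas = {}
    for path in spec:
        area, _, feature = path.partition(".")
        areas.setdefault(area, []).append(feature)
    return areas

print(by_area(DS1)["DataProcessing"])  # ['AnalyticsGoal.Batch']
```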
  • 9. Procedural Model • Based on the Declarative Models, the TOREADOR Service Selection (SS) will return a set of services consistent with DS1 and DS2 • The user can easily compose these services to address the scenario's goals [Diagram: DS1 → SS → SC1; DS2 → SS → SC2]
  • 10. Procedural Model • The two compositions must be connected, as the egestion of SC1 is the ingestion for SC2 [Diagram: DS1 → SS → SC1; DS2 → SS → SC2]
  • 13. Deployment Model • The TOREADOR compiler translates SC1 and SC2 into executable orchestrations (WC1, WC2) in a suitable workflow language, e.g.:

spark-filter-sensorsTest : filter --expr="sensorsDF#SensorId === 5958" --inputPath="/user/root/sensors/joined.csv" --outputPath="/user/root/sensors_test.csv" &&
spark-assemblerTest : spark-assembler --features="Data,Quote" --inputPath="/user/root/sensors_test.csv" --outputPath="/user/root/sensors/sensors_test_assembled.csv" &&
spark-gbt-predict : batch-gradientboostedtree-classification-predict --inputPath=/user/root/sensors/sensors --outputPath=/user/root/sensors/sensors --model=/user/root/sensors/model
  • 15. Deployment Model • The execution of WC2 produces the results
  • 17. The Code-based Line: Code Once/Deploy Everywhere • The Toreador Code-line user is an expert programmer, aware of the potential (flexibility and controllability) and purposes (analytics developed from scratch, or migration of legacy code) of a code-based approach. She expresses the parallel computation of a coded algorithm in terms of parallel primitives. Toreador distributes it among computational nodes hosted by different Cloud environments. The resulting computation can be saved as a service for the Service-based line. [Diagram: I. Code → II. Transform (Skeleton-Based Code Compiler) → III. Deploy]
  • 18. Code-based compiler • The Skeleton-Based Code Compiler transforms the Source Code against a parallel skeleton (MapReduce, Bag of Tasks, Producer/Consumer, …) and generates the secondary scripts for deployment. Source code (excerpt):

import math
import random

def data_parallel_region(distr, func, *repl):
    return [func(x, *repl) for x in distr]

def distance(a, b):
    """Computes euclidean distance between two vectors"""
    return math.sqrt(sum([(x[1] - x[0])**2 for x in zip(a, b)]))

def kmeans_init(data, k):
    """Returns initial centroids configuration"""
    return random.sample(data, k)

def kmeans_assign(p, centroids):
    """Returns the given instance paired to key of nearest centroid"""
    comparator = lambda x: distance(x[1], p)
    …
  • 19. Key Advances • Batch and stream support: guide the user in selecting a consistent set of services for both batch and stream computations • Platform independence: use a smart compiler for generating executable computations for different platforms • End-to-end verifiability: include an end-to-end procedure for checking the consistency of model specifications • Model reuse and refinement: store declarative, procedural and deployment models as templates to replicate or extend designs
  • 20. Want to give it a try? Stay tuned! http://www.toreador-project.eu/community/ Or reach us at info@toreador-project.eu (2017)
  • 23. Declarative Models: vocabulary • The declarative model offers a vocabulary for a computation-independent description of a BDA • Organized in 5 areas: • Representation (Data Mode, Data Type, Management, Partitioning) • Preparation (Data Reduction, Expansion, Cleaning, Anonymization) • Analytics (Analytics Model, Task, Learning Approach, Expected Quality) • Processing (Analysis Goal, Interaction, Performances) • Visualization and Reporting (Goal, Interaction, Data Dimensionality) • Each specification can be structured in three levels: • Goal: Indicator – Objective – Constraint • Feature: Type – Sub Type – Sub Sub Type
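Membership in the five areas can be checked mechanically; a minimal sketch, assuming a toy vocabulary table built from just the area and feature names listed on this slide (not the full TOREADOR vocabulary):

```python
# Illustrative sketch: validate an (area, feature) pair against the five
# vocabulary areas. The subtrees below are truncated to the names shown on
# the slide; the real vocabulary is deeper (Type - Sub Type - Sub Sub Type).

VOCABULARY = {
    "Representation": {"Data Mode", "Data Type", "Management", "Partitioning"},
    "Preparation": {"Data Reduction", "Expansion", "Cleaning", "Anonymization"},
    "Analytics": {"Analytics Model", "Task", "Learning Approach", "Expected Quality"},
    "Processing": {"Analysis Goal", "Interaction", "Performances"},
    "Visualization and Reporting": {"Goal", "Interaction", "Data Dimensionality"},
}

def is_known_feature(area, feature):
    """True when the feature belongs to the given area of the vocabulary."""
    return feature in VOCABULARY.get(area, set())

print(is_known_feature("Analytics", "Task"))       # True
print(is_known_feature("Processing", "Cleaning"))  # False
```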
  • 24. Declarative Models • A web-based GUI for specifying the requirements of a BDA • No coding, for basic users • Analytics services are provided by the target TOREADOR platform • Big Data campaign built by composing existing services • Based on model transformations
  • 25. Declarative Models • A web-based GUI for specifying the requirements of a BDA • Data_Preparation.Data_Source_Model.Data_Model.Document_Oriented • Data_Analytics.Analytics_Aim.Task.Crisp_Clustering
  • 26. Declarative Models: machine readable • A web-based GUI for specifying the requirements of a BDA • Data_Preparation.Data_Source_Model.Data_Model.Document_Oriented • Data_Analytics.Analytics_Aim.Task.Crisp_Clustering

…
"tdm:label": "Data Representation",
"tdm:incorporates": [ {
    "@type": "tdm:Feature",
    "tdm:label": "Data Source Model Type",
    "tdm:constraint": "{}",
    "tdm:incorporates": [ {
        "@type": "tdm:Feature",
        "tdm:label": "Data Structure",
        "tdm:constraint": "{}",
        "tdm:visualisationType": "Option",
        "tdm:incorporates": [ {
            "@type": "tdm:Feature",
            "tdm:constraint": "{}",
            "tdm:label": "Structured",
            "$$hashKey": "object:21"
        } ]
    } ]
}, ....
  • 27. Interference Declaration • A few examples:
Data_Preparation.Anonymization.Technique.k-anonymity → ¬ Data_Analytics.Analytics_Quality.False_Positive_Rate.low
Data_Preparation.Anonymization.Technique.hashing → ¬ Data_Analytics.Analytics_Aim.Task.Crisp_Clustering.algorithm=k-means
Data_Representation.Storage_Property.Coherence_Model.Strong_Consistency → ¬ Data_Representation.Storage_Property.Partitioning
  • 28. Consistency Check • Interference Declarations • Boolean Interference: P → ¬Q • Intensity of an Interference: DP ∩ DQ • Interference Enforcement • Fuzzy interpretation: max(1−P, 1−Q)
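How such a declaration might be enforced can be sketched as follows, assuming each feature selection is scored with a degree of satisfaction in [0, 1]; the function names and scores are illustrative, not the TOREADOR implementation:

```python
# Illustrative sketch of interference enforcement (not the TOREADOR code).
# Each feature of a declarative model gets a degree of satisfaction in [0, 1].

def boolean_interference(p: bool, q: bool) -> bool:
    """Boolean rule P -> not Q: satisfied unless both P and Q hold."""
    return (not p) or (not q)

def fuzzy_interference(p: float, q: float) -> float:
    """Fuzzy interpretation max(1 - P, 1 - Q): 1.0 means fully consistent,
    values near 0.0 mean both interfering features are strongly present."""
    return max(1.0 - p, 1.0 - q)

# k-anonymity strongly selected (0.9) together with a low false-positive-rate
# goal (0.8): the declaration P -> not Q is close to violated.
score = fuzzy_interference(0.9, 0.8)
print(round(score, 2))  # 0.2
```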
  • 30. Methodology: Building Blocks • Declarative Specifications allow customers to define the declarative models shaping a BDA and retrieve a set of compatible services • The Service Catalog specifies the set of abstract services (e.g., algorithms, mechanisms, or components) that are available to Big Data customers and consultants for building their BDA • The Service Composition Repository permits specifying the procedural model defining how services can be composed to carry out the Big Data analytics • Supports the specification of an abstract Big Data service composition • Deployment Configurations define the platform-dependent version of a procedural model, as a workflow that is ready to be executed on the target Big Data platform
  • 31. Overview of the Methodology Declarative Model Specification Service Selection Procedural Model Definition Workflow Compiler Deployment Model Execution Declarative Specifications Service Catalog Service Composition Repository Deployment Configurations MBDAaaS Platform Big Data Platform
  • 32. Procedural Models • Platform-independent models that formally and unambiguously describe how analytics should be configured and executed • They are generated following goals and constraints specified in the declarative models • They provide a workflow in the form of a service orchestration • Sequence • Choice • If-then • Do-While • Split-Join
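A procedural model of this kind can be pictured as a small tree of orchestration constructs. The sketch below is illustrative only: the class names, and the service names taken loosely from the deployment listing earlier in the deck, are assumptions rather than the TOREADOR model format:

```python
# Hypothetical sketch: a procedural model as nested orchestration constructs.
# Only Sequence and Split-Join are modelled here; Choice, If-then and Do-While
# would be analogous node types.

from dataclasses import dataclass, field

@dataclass
class Service:
    name: str

@dataclass
class Sequence:
    steps: list = field(default_factory=list)

@dataclass
class SplitJoin:
    branches: list = field(default_factory=list)

# SC1 from the running scenario: filter, assemble features, then train.
sc1 = Sequence(steps=[
    Service("filter"),
    Service("spark-assembler"),
    Service("batch-gradientboostedtree-train"),
])

def services_of(node):
    """Flatten an orchestration tree into the list of service names."""
    if isinstance(node, Service):
        return [node.name]
    children = node.steps if isinstance(node, Sequence) else node.branches
    return [s for child in children for s in services_of(child)]

print(services_of(sc1))  # ['filter', 'spark-assembler', 'batch-gradientboostedtree-train']
```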
  • 33. Service Composition • The user creates the flow based on the list of returned services
  • 34. Service Composition • The user creates the flow based on the list of returned services • Services enriched with ad hoc parameters
  • 35. Service Composition • The user creates the flow based on the list of returned services • Services enriched with ad hoc parameters • The flow is submitted to the service, which translates it into an OWL-S service composition
  • 36. Service Composition • All internals are made explicit • Clear specification of the services • Reuse and modularity
  • 38. Overview of the Methodology Declarative Model Specification Service Selection Procedural Model Definition Workflow Compiler Deployment Model Execution Declarative Specifications Service Catalog Service Composition Repository Deployment Configurations MBDAaaS Platform Big Data Platform
  • 39. Workflow compiler • It consists of two main sub-processes • Structure generation: the compiler parses the procedural model and identifies the process operators (sequence, alternative, parallel, loop) composing it • Service configuration: for each service in the procedural model, the corresponding concrete service is identified and inserted in the deployment model • Supports transformations to any orchestration engine available as a service • Available for Oozie and Spring XD
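The structure-generation sub-process can be pictured as a recursive walk over the procedural model; a hedged sketch, where the dict encoding of the input and the emitted constructs are simplified stand-ins for the real OWL-S input and Oozie output:

```python
# Illustrative sketch of "structure generation": walk a procedural model and
# emit engine-level constructs. A sequence becomes a chain of actions and a
# parallel block becomes a fork/join pair, loosely mirroring Oozie's notions.

def compile_structure(node):
    """Map process operators onto workflow-engine constructs."""
    kind = node["kind"]
    if kind == "service":      # leaf: becomes an executable action
        return {"action": node["name"]}
    if kind == "sequence":     # chain of actions, each step followed by the next
        return {"chain": [compile_structure(s) for s in node["steps"]]}
    if kind == "parallel":     # split-join: fork the branches, then join
        return {"fork": [compile_structure(b) for b in node["branches"]],
                "join": True}
    raise ValueError(f"unknown operator: {kind}")

procedural = {
    "kind": "sequence",
    "steps": [
        {"kind": "service", "name": "filter"},
        {"kind": "parallel", "branches": [
            {"kind": "service", "name": "train"},
            {"kind": "service", "name": "evaluate"},
        ]},
    ],
}

deployment = compile_structure(procedural)
print(deployment["chain"][0])  # {'action': 'filter'}
```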
  • 40. Deployment Model • The workflow compiler takes as input • the OWL-S service composition • information on the target platform (e.g., installed services/algorithms) • It produces as output an executable workflow • For example an Oozie workflow: • XML file of the workflow • job.properties • System variables
  • 41. Translating the Composition Structure • Deployment models: • specify how procedural models are instantiated and configured on a target platform • drive analytics execution in real scenarios • are platform-dependent • The workflow compiler transforms the procedural model into a deployment model that can be directly executed on the target platform. • This transformation is based on a compiler that takes as input • the OWL-S service composition • information on the target platform (e.g., installed services/algorithms) • and produces as output a technology-dependent workflow
  • 42. Translating the Composition Structure • The OWL-S service composition structure is mapped onto different control constructs
  • 43. Generating an Executable Workflow • Workflows contain 3 distinct types of PLACEHOLDERS • GREEN placeholders are SYSTEM variables defined in the Oozie properties • RED placeholders are JOB variables defined in the file job.properties • YELLOW placeholders are ARGUMENTS of executable jobs on the OOZIE Server • More in the demo…