
Scaling a backend for a big data and blockchain environment by Rafael Ríos at Big Data Spain 2017



2gether is a financial platform based on Blockchain, Big Data and Artificial Intelligence that allows interaction between users and third-party services in a single interface.

https://www.bigdataspain.org/2017/talk/scaling-a-backend-for-a-big-data-and-blockchain-environment

Big Data Spain 2017
November 16th - 17th, Kinépolis Madrid


Scaling a backend for a big data and blockchain environment by Rafael Ríos at Big Data Spain 2017

  1. Scaling a backend for a big data and blockchain environment
  2. Agenda: 1. Introduction (the companies, the project and me), p. 3; 2. Backend challenge, p. 7; 3. Big Data Integration, p. 17; 4. Blockchain Integration, p. 21
  3. Me: Head of Blockchain / Backend · @ganchix / ganchix
  4. The companies
  5. The project: • Financial platform • Marketplace • Tokenization • Operate with cryptocurrency • Liquidity predictive models • Credit scoring • Product recommendation
  6. Backend challenge - The evolution: Dapp, Smart Contract, Ethereum
  7. Backend challenge - The evolution
  8. Backend challenge - The evolution. Dapp problems: • Non-cryptocurrency users • Problems with some integrations • Legal problems
  9. Backend challenge - The evolution
  10. Backend challenge - The evolution
  11. Backend challenge - Why microservices? • Easier migration of the Dapp • Easy to scale • Polyglot databases and languages
  12. Why not just use the blockchain together with a database?
  13. Backend challenge - Microservice Architecture Stack (a minimal service-discovery sketch follows the transcript): 1. Spring Cloud Netflix and Kubernetes: • Easy to learn • Nice integrations • Spring 5 reactive. 2. Docker: • Most widely adopted container technology • Well supported. 3. Kubernetes: • Multi-cloud providers and on-premises data centers • Self-repair and health-check capabilities • Auto-scaling
  14. Backend challenge - Microservice Architecture Stack
  15. Backend challenge - Deployment
  16. Big Data Integration - Tasks (a Spark + Cassandra job sketch follows the transcript): • PFM value generation from user data: Apache Spark + Cassandra • Forecast prediction and regeneration of these models: Apache Spark + Cassandra • Product recommendations based on the user's economic profile and real needs: Apache Spark + Cassandra + Neo4j • Credit scoring calculation: Apache Spark + Cassandra
  17. Big Data Integration - Events: • The tasks are hard: they need time and resources • Real time is not needed • Event-Driven Architecture (an event-publishing sketch follows the transcript)
  18. Big Data Integration - How?
  19. Big Data Integration - RabbitMQ vs Kafka. RabbitMQ: • designed as a general-purpose message broker • supports existing protocols like AMQP, STOMP, MQTT • finer-grained consistency control/guarantees on a per-message basis • complex routing. Kafka: • designed for high-volume publish-subscribe messages and streams, meant to be durable, fast, and scalable • event sourcing • your application needs access to stream history • no complex routing. https://content.pivotal.io/blog/understanding-when-to-use-rabbitmq-or-apache-kafka
  20. Blockchain Integration - Private Ethereum (a key-handling and transfer sketch follows the transcript): • Deployed in Kubernetes • Only accessible through a NodeJS API • All keys are stored in secret vaults • Used for: tokenization and user transactions
  21. Blockchain Integration - Private Ethereum
  22. Blockchain Integration - Ethereum Main Net
  23. Blockchain Integration - Ethereum Main Net
  24. Blockchain Integration - Ethereum Main Net (an Infura connection sketch follows the transcript): • We are the owners of the wallets • We use Infura to connect to the blockchain • Used for: payments and transfers
  25. Thanks! @ganchix · @2getherbank · @bigeeksoftware. ICO is coming!
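
To make slide 13 concrete, here is a minimal sketch of what a Spring Cloud Netflix microservice entry point could look like when it registers itself with a Eureka discovery server. The deck shows no code, so the package, class and service names below are invented for illustration.

```java
// Hypothetical microservice entry point registering with Eureka (Spring Cloud Netflix).
// Nothing here comes from the deck; names and configuration are illustrative only.
package com.example.accounts;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.client.discovery.EnableDiscoveryClient;

@SpringBootApplication
@EnableDiscoveryClient // registers this service with the discovery server configured in application.yml
public class AccountsServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(AccountsServiceApplication.class, args);
    }
}
```

In a Kubernetes deployment like the one sketched on slides 14-15, the same container image would simply be replicated by the cluster, with service discovery resolving the running instances.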
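The batch tasks on slide 16 pair Apache Spark with Cassandra (plus Neo4j for recommendations). Below is a rough sketch of such a job using the DataStax Spark Cassandra connector; the keyspace, table and column names are assumptions, and the simple aggregation stands in for the real PFM model, which the deck does not describe.

```java
// Hypothetical Spark batch job reading user data from Cassandra and writing
// derived PFM values back, in the spirit of slide 16. All names are invented.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class PfmValuesJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("pfm-values")
                .config("spark.cassandra.connection.host", "cassandra.default.svc") // assumed in-cluster host
                .getOrCreate();

        // Read raw transactions through the Spark Cassandra connector.
        Dataset<Row> transactions = spark.read()
                .format("org.apache.spark.sql.cassandra")
                .option("keyspace", "banking")
                .option("table", "transactions")
                .load();

        // Aggregate spending per user and category; a stand-in for the real model.
        Dataset<Row> pfmValues = transactions
                .groupBy("user_id", "category")
                .sum("amount")
                .withColumnRenamed("sum(amount)", "total_amount");

        // Persist the derived values for the API layer to serve.
        pfmValues.write()
                .format("org.apache.spark.sql.cassandra")
                .option("keyspace", "banking")
                .option("table", "pfm_values")
                .mode("append")
                .save();

        spark.stop();
    }
}
```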
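Slides 17-19 describe an event-driven integration, where heavy tasks are triggered asynchronously instead of in real time, and compare RabbitMQ with Kafka. As one possible shape of that, here is a small producer using the plain Apache Kafka Java client; the topic name, payload and broker address are assumptions, and the deck does not say which broker the team finally chose.

```java
// Hypothetical domain-event publisher for the event-driven integration on slides 17-19.
// Broker address, topic and payload are assumptions, not values from the talk.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class UserEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.default.svc:9092"); // assumed in-cluster address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Emit an event; the Spark jobs consume it asynchronously, since the
            // tasks are heavy and real time is not required.
            String payload = "{\"userId\":\"42\",\"type\":\"TRANSACTION_CREATED\",\"amount\":19.90}";
            producer.send(new ProducerRecord<>("user-events", "42", payload));
            producer.flush();
        }
    }
}
```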
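Slide 20's private Ethereum chain is fronted by a NodeJS API, with keys kept in secret vaults. Keeping to Java for consistency with the other sketches, the snippet below only illustrates the general pattern with web3j: load a private key from a mounted secret (for example a Kubernetes Secret exposed as an environment variable) and send a value transfer on the private chain. The endpoint, environment variable and addresses are invented.

```java
// Hypothetical value transfer on a private Ethereum chain using web3j.
// The backend in the talk exposes this through a NodeJS API; this Java sketch is
// only a pattern illustration. Endpoint, env var and recipient are assumptions.
import java.math.BigDecimal;
import org.web3j.crypto.Credentials;
import org.web3j.protocol.Web3j;
import org.web3j.protocol.core.methods.response.TransactionReceipt;
import org.web3j.protocol.http.HttpService;
import org.web3j.tx.Transfer;
import org.web3j.utils.Convert;

public class PrivateChainTransfer {
    public static void main(String[] args) throws Exception {
        // JSON-RPC endpoint of the in-cluster node (assumed service name).
        Web3j web3 = Web3j.build(new HttpService("http://geth.blockchain.svc:8545"));

        // Private key injected from a secrets vault / Kubernetes Secret, never hard-coded.
        Credentials credentials = Credentials.create(System.getenv("WALLET_PRIVATE_KEY"));

        // Move 1 ether on the private chain to another user's address.
        TransactionReceipt receipt = Transfer.sendFunds(
                web3, credentials,
                "0x0000000000000000000000000000000000000042", // placeholder recipient
                BigDecimal.ONE, Convert.Unit.ETHER).send();

        System.out.println("tx hash: " + receipt.getTransactionHash());
        web3.shutdown();
    }
}
```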
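For the Ethereum Main Net integration on slide 24, the backend connects through Infura instead of running its own public node. A minimal read-only connection sketch with web3j follows; the Infura URL uses today's v3 format as an assumption, and the project id and wallet address are placeholders, not values from the talk.

```java
// Hypothetical read-only queries against Ethereum Main Net through Infura via web3j.
// <PROJECT_ID> and the wallet address are placeholders.
import java.math.BigInteger;
import org.web3j.protocol.Web3j;
import org.web3j.protocol.core.DefaultBlockParameterName;
import org.web3j.protocol.http.HttpService;

public class MainNetGateway {
    public static void main(String[] args) throws Exception {
        Web3j mainNet = Web3j.build(
                new HttpService("https://mainnet.infura.io/v3/<PROJECT_ID>"));

        // Current chain height, as a basic connectivity check.
        BigInteger height = mainNet.ethBlockNumber().send().getBlockNumber();

        // Balance of a company-owned wallet (placeholder address), in wei.
        BigInteger balance = mainNet
                .ethGetBalance("0x0000000000000000000000000000000000000042",
                        DefaultBlockParameterName.LATEST)
                .send()
                .getBalance();

        System.out.println("block height: " + height + ", wallet balance (wei): " + balance);
        mainNet.shutdown();
    }
}
```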
