Overview for aws-s3

Fannie Mae Processes over a Quarter Million Loans per Day with Amazon S3

Fannie Mae discusses how they completely re-architected a mission-critical application using AWS native services, processing hundreds of thousands of mortgage loans every day in a highly scalable and reliable manner. The transaction-heavy workload runs more than 20 million Amazon S3 transactions a day, each completing within 150 milliseconds, providing increased uptime and faster response times.
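A quick back-of-envelope check on the figures quoted above (assuming the transactions are spread evenly across the day, which real traffic is not): 20 million S3 transactions per day averages out to roughly 231 requests per second.

```python
# Rough average request rate implied by the quoted daily volume.
# Assumes even distribution over the day; real peaks will be higher.
transactions_per_day = 20_000_000
seconds_per_day = 86_400
avg_rps = transactions_per_day / seconds_per_day
print(round(avg_rps))  # → 231
```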


How Uses Amazon SageMaker & AWS Glue to Enable Machine Learning

The company processes over 40 million documents a year that customers upload into its system. These documents arrive unstructured, in random order, and in multiple formats. This talk describes how the company uses AWS for its Secure Data Platform infrastructure in order to enable machine learning. Kyle Guichard, Sr. Director of Systems Engineering, highlights how they leverage Amazon S3, Amazon SageMaker, and AWS Glue to enable the next generation of their Intelligent Virtual Assistant for zero data entry, resulting in increased accuracy and efficiency in digital business payments for their customers.


Vonage & Aspect: Transform Communications & Customer Engagement

In this discussion, you will learn from market leader Vonage how and why they re-architected their QoS-sensitive, highly available, and highly performant legacy real-time communications systems to take advantage of Amazon EC2, Enhanced Networking, Amazon S3, Auto Scaling groups, Amazon RDS, Amazon ElastiCache, AWS Lambda, AWS Step Functions, Amazon SNS, Amazon SQS, Amazon Kinesis, Amazon EFS, and more. You will also learn how Aspect, a multinational leader in call center solutions, used AWS Lambda, Amazon API Gateway, Amazon Kinesis, Amazon ElastiCache, Amazon Cognito, and Application Load Balancer, together with open-source API development tooling from Swagger, to build a comprehensive, microservices-based solution. Vonage and Aspect share their journey to TCO optimization, global outreach, and agility, with best practices and insights.


How Robinhood Used AWS to Build a Self-Service Data Platform

This discussion is about how Robinhood used AWS tools such as Amazon S3, Amazon Athena, Amazon EMR, AWS Glue, and Amazon Redshift to build a robust data lake that can operate at petabyte scale. You will learn the design paradigms and tradeoffs made to achieve a cost-effective and performant cluster that unifies all data access, analytics, and machine learning use cases.


Pinterest’s Story of Streaming Hundreds of Terabytes of Pins from MySQL to S3/Hadoop Continuously

This talk discusses how Pinterest designed and built a continuous database (DB) ingestion system for moving MySQL data into near-real-time computation pipelines with only 15 minutes of latency, supporting their dynamic personalized recommendations and search indices. As Pinterest moves toward real-time computation, it faces stringent service-level agreements, such as making MySQL data available on S3/Hadoop within 15 minutes and serving the DB data incrementally in stream processing. The data team designed WaterMill, a continuous DB ingestion system that listens for MySQL binlog changes, publishes the MySQL changelogs as an Apache Kafka® change stream, and ingests and compacts the stream into Parquet columnar tables in S3/Hadoop within 15 minutes.
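The compaction step of such a pipeline can be sketched in a few lines. This is a minimal illustration, not WaterMill's actual implementation: it assumes change events (as might be decoded from a MySQL binlog and read off a Kafka topic) arrive in commit order, each carrying an operation type, a primary key, and the full row, and it collapses them into a latest-state snapshot of the table.

```python
from typing import Iterable

def compact_changelog(events: Iterable[dict]) -> dict:
    """Collapse an ordered stream of row-level change events into
    the latest snapshot of the table, keyed by primary key.

    Assumed (hypothetical) event shape:
        {"op": "insert" | "update" | "delete", "pk": ..., "row": {...}}
    """
    snapshot = {}
    for event in events:  # events must be applied in commit order
        key = event["pk"]
        if event["op"] == "delete":
            snapshot.pop(key, None)   # drop the row if it exists
        else:
            # "insert" and "update" both carry the full row image,
            # so applying them is a simple overwrite
            snapshot[key] = event["row"]
    return snapshot

# Example change stream for a hypothetical `pins` table
events = [
    {"op": "insert", "pk": 1, "row": {"pin_id": 1, "board": "travel"}},
    {"op": "insert", "pk": 2, "row": {"pin_id": 2, "board": "food"}},
    {"op": "update", "pk": 1, "row": {"pin_id": 1, "board": "hiking"}},
    {"op": "delete", "pk": 2, "row": None},
]
print(compact_changelog(events))
# → {1: {'pin_id': 1, 'board': 'hiking'}}
```

In a real pipeline, the resulting snapshot (or the per-window delta) would then be written out as Parquet files to S3; the point here is only the replay-and-overwrite semantics of changelog compaction.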


Logitech Accelerates Cloud Analytics Using Data Virtualization by Avinash Deshpande

Learn how Logitech has adopted the AWS platform and big data on the cloud for all of their analytical needs, including Amazon Redshift and S3. In this presentation, Avinash Deshpande covers the business rationale for migrating to the cloud, how data virtualization enables the migration, and how to run data virtualization itself in the cloud.