After setting up the environment, it is now time to simulate the beer ratings flowing in. As explained, I will start several generators simultaneously. To introduce some (intended) data skew away from the average, several generators will share the same structural event definition, but differ in their combinations of users, beers, and rating upper and lower bounds. Of course this is also based on my personal preference – who said my demonstration scenario should be fair?
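As a rough sketch of what such a skewed setup might look like (the schema file names, topic name, and options below are hypothetical, not the ones from my actual setup), two generators could be launched side by side, each with its own schema carrying different rating bounds:

```shell
# Hypothetical example: two ksql-datagen instances producing to the same
# topic, each driven by an Avro schema with different rating ranges.
ksql-datagen schema=beer_ratings_low.avro format=avro topic=beer_ratings \
    key=userid maxInterval=1000 &

ksql-datagen schema=beer_ratings_high.avro format=avro topic=beer_ratings \
    key=userid maxInterval=1000 &
```

Since both generators write to the same topic, the downstream KSQL queries see one combined, skewed stream.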
Milco Numan
Kafka Streaming – Setting Up
Introduction
In this blog, I am going to zoom in on KSQL and the opportunities it offers for manipulating streaming data in Kafka, merely by using SQL-like statements. One of the neat things about the Confluent Kafka platform is that it provides additional utilities on top of the core Kafka tools. One of these utilities is ksql-datagen, which allows users to generate random data based on a simple schema definition in Apache Avro.
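For illustration, a minimal Avro schema for ksql-datagen might look like the following (the record and field names and the value ranges are my own hypothetical example, not the schema used later in this series). The generator reads the `arg.properties` annotations to decide how to randomise each field:

```json
{
  "namespace": "example",
  "name": "beer_rating",
  "type": "record",
  "fields": [
    { "name": "userid",
      "type": { "type": "string",
                "arg.properties": { "regex": "user_[1-9]" } } },
    { "name": "beer",
      "type": { "type": "string",
                "arg.properties": { "options": [ "IPA", "Stout", "Tripel" ] } } },
    { "name": "rating",
      "type": { "type": "int",
                "arg.properties": { "range": { "min": 1, "max": 10 } } } }
  ]
}
```

Changing the `range` bounds (or the `options` lists) per schema file is what makes it possible to skew the generated data in one direction or another.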
Kafka streaming
Introduction
For a long time I have been interested in Apache Kafka and its applications. Unfortunately, forced by circumstances, work and other personal endeavours, I had not been able to really dive deeper into the matter until spring 2019. In April I finally finished the Udemy course “Apache Kafka for Beginners“.
At work, my exposure to Kafka had been limited, as we were (ultimately) publishing messages onto a Kafka topic using Oracle Service Bus. However, this was actually a Java-built integration, as we were just pushing the messages onto a JMS queue, which had an MDB listening that propagated the messages to the Kafka cluster.
After completing the first training my interest grew, especially in the role of Kafka in real-time event systems, and I decided to take another course, on Kafka Streams. I was a bit disappointed that this specific course focussed quite heavily on Java development, and as an exception I decided to abandon the course without completing it. During one of the Kafka Meetups, I found out that Confluent was actually offering a very interesting alternative to programming the Kafka Streams API in Java, viz. KSQL.
Custom CloudWatch Metrics – revisited
In my previous blog I investigated how to create custom metrics for an EC2 instance using Python and Boto3, the AWS SDK for Python. At the time I was specifically interested in demonstrating that it was not necessary to use those “old fashioned” Perl scripts to post custom metrics from your EC2 instance, but that you could also do this yourself rather easily using Python. All steps I’ve taken in the process were performed manually, but that’s not really something you’d like to do whenever there are more than [insert your threshold here] instances to provision.
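As a minimal sketch of the Boto3 approach (the metric name, namespace, and instance ID below are made-up placeholders, not the values from my actual setup), the payload for CloudWatch can be built separately from the call that publishes it:

```python
# Sketch: posting a custom CloudWatch metric from an EC2 instance with Boto3.
# Metric name, namespace, and instance ID are hypothetical examples.
from datetime import datetime, timezone


def build_metric_data(name, value, unit, instance_id):
    """Build the MetricData payload for CloudWatch's put_metric_data call."""
    return [{
        "MetricName": name,
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Timestamp": datetime.now(timezone.utc),
        "Value": value,
        "Unit": unit,
    }]


def publish(namespace, metric_data):
    # Requires boto3 and AWS credentials (e.g. an instance profile);
    # imported lazily so the payload builder stays testable without AWS.
    import boto3
    cloudwatch = boto3.client("cloudwatch")
    cloudwatch.put_metric_data(Namespace=namespace, MetricData=metric_data)


# Example: report available memory (value obtained elsewhere) for an instance.
data = build_metric_data("MemoryAvailable", 512.0, "Megabytes",
                         "i-0123456789abcdef0")
# publish("Custom/EC2", data)  # uncomment on an instance with IAM permissions
```

Running this from cron (or a systemd timer) on each instance is what the provisioning automation would then have to set up.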
Custom CloudWatch Metrics in Python? Yes we can!
Introduction
Currently, AWS has three associate level certifications: Solutions Architect, Developer and SysOps Administrator. In 2017, I passed the certification exams for the first two, and as I visited AWS re:Invent in Las Vegas last year, I decided that I would also like to obtain the one still missing from my curriculum.
On Course For Certification
One of the ways to prepare for the AWS Associate certifications (apart from reading the documentation) is completing one of the many courses offered on the online learning platforms. For my first two certifications, I used the certification training offered by A Cloud Guru, which was very good. However, in the meantime I have also completed some courses on the Udemy platform by Stephane Maarek on miscellaneous topics related to Amazon Web Services (CloudFormation Masterclass, AWS Lambda and the Serverless Framework) that I also liked very much, so I decided to start my certification preparation with Stephane’s wonderful “Ultimate AWS Certified SysOps Administrator Associate 2019” (quite a mouthful, but it’s a 17 hour journey).
AWS re:Invent 2018 – mostly serverless
Where we’re going, we don’t need servers!
One of the best soundbites I read during the week of re:Invent was on the t-shirts of one of the vendors right there … (see the title of this section). I cannot recall which one it was, and I must confess that I just put all my summer clothing in storage. Anyway, going serverless is one of the big trends and for me, as an integration consultant, this is the most natural fit for moving into the cloud.
AWS re:Invent 2018 – new launches
The Event
In the first part of my blog I commented on Las Vegas itself and my experiences travelling there; in this part I will describe my experience with the actual reason for the trip, viz. attending the AWS re:Invent 2018 event.
As you may already know, re:Invent is the yearly conference from Amazon Web Services (AWS), this time in its seventh edition, held in Las Vegas, for customers, partners and vendors from the AWS ecosystem. The size of re:Invent is astonishing: during this conference (starting on Monday and ending around noon on Friday), it welcomes 53,000 participants, spread across 7 different venues, for a total of over 2,200 sessions of content.
Las Vegas, or Tinseltown in the Desert
Sin City
This weekend I returned from visiting my first AWS re:Invent. This seventh edition was (as always) held in Las Vegas – aka Sin City, as it caters for the many vices people may pursue. Needless to say, I have been a good boy and only indulged in drinking a few beers …
Viva Las Vegas!
Introduction
After attending Oracle Open World several times, I have become increasingly interested in cloud technology over the past few years. Not primarily SaaS solutions, where a software package runs in a cloud environment, but above all PaaS solutions, where development platforms and application components are ‘liberated’ from the constricting and dated corset of the corporate data centre and consumed from the cloud nirvana. And what better place to dive in than the solutions of the market leader in infrastructure (IaaS) and platform (PaaS) solutions in the cloud, Amazon Web Services?
Guys don’t want to commit
Last week I hosted a “Bits & Bites” session on Serverless for my colleagues and some guests. These sessions are centered around some new technology, concept or piece of functionality (the “Bits” part) that one of us has encountered in his daily work and wants to share. These events are great fun, as you get to meet colleagues who are often working on other projects and have dinner together (the “Bites” part) – these kinds of extras make the work fun!
As always, I had prepared way too much material for the hands-on exercises. On the positive side, nobody can claim that they finished early and had to keep themselves busy.