To understand and contrast two fundamental models of data communication in the cloud: the request-response model of APIs and the continuous flow model of stream processing. This lab will demonstrate how to interact with data-at-rest using a standard REST API and how to observe and query data-in-motion using a managed event streaming platform.
This lab brings to life the concepts from Chapter 9 ("What Data is Flowing in Right Now?"). So far, we have mostly treated data as something you store and then query: data-at-rest. Now, we will explore data-in-motion, treating data as a continuous stream of events that are processed as they occur. You will need your GitHub account for this lab.
A REST API is like ordering from a restaurant menu. You make a specific request (e.g., "I'd like the burger"), send it to the kitchen (the server), and wait for a specific response (your burger arrives). You only get data when you ask for it.
We will use the free JSONPlaceholder API, which provides sample data for testing.
Open a new browser tab.
In the address bar, paste the following URL and press Enter:
https://jsonplaceholder.typicode.com/users/1
The browser will display the raw JSON response. It should look something like this:

```json
{
  "id": 1,
  "name": "Leanne Graham",
  "username": "Bret",
  "email": "[email protected]",
  "address": {
    "street": "Kulas Light",
    "suite": "Apt. 556",
    "city": "Gwenborough",
    "zipcode": "92998-3874",
    "geo": {
      "lat": "-37.3159",
      "lng": "81.1496"
    }
  },
  "phone": "1-770-736-8031 x56442",
  "website": "hildegard.org",
  "company": {
    "name": "Romaguera-Crona",
    "catchPhrase": "Multi-layered client-server neural-net",
    "bs": "harness real-time e-markets"
  }
}
```
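The same request-response exchange can be scripted instead of typed into a browser. The sketch below is illustrative only: the live call (via Python's standard-library `urllib.request`) is commented out so the snippet runs offline, and it instead parses the sample payload shown above. Only the URL and field names come from the lab; the rest is an assumption about how you might handle the response.

```python
import json
# from urllib.request import urlopen  # uncomment for a live request

URL = "https://jsonplaceholder.typicode.com/users/1"

# Live pull (requires network access):
# with urlopen(URL) as resp:
#     body = resp.read().decode("utf-8")

# Offline: reuse the sample response shown above.
body = '''
{
  "id": 1,
  "name": "Leanne Graham",
  "username": "Bret",
  "email": "[email protected]",
  "address": {"street": "Kulas Light", "suite": "Apt. 556",
              "city": "Gwenborough", "zipcode": "92998-3874",
              "geo": {"lat": "-37.3159", "lng": "81.1496"}},
  "phone": "1-770-736-8031 x56442",
  "website": "hildegard.org",
  "company": {"name": "Romaguera-Crona",
              "catchPhrase": "Multi-layered client-server neural-net",
              "bs": "harness real-time e-markets"}
}
'''

user = json.loads(body)          # one request, one complete response
print(user["name"])              # -> Leanne Graham
print(user["address"]["city"])   # -> Gwenborough
```

Note the shape of the interaction: one call, one complete answer. Nothing arrives until you ask again.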
Notice that to get the data for user 2, you must make a new request by changing the URL to .../users/2. You have to "pull" the data each time you want it. Try that with another user (e.g., .../users/9) and take a screenshot for submission. This is Screenshot A.

Event streaming is like a live news broadcast. The TV station (the producer) continuously sends out a signal (a stream of events), and you, the viewer (the consumer), can tune in at any time to see what's happening right now. The data is "pushed" to you continuously.
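The pull-versus-push contrast can be sketched in a few lines of Python. This is an illustration only, not the Kafka client used later in the lab: a plain function stands in for a REST endpoint, and a generator stands in for an event stream. All names and sample data here are hypothetical.

```python
import time
from typing import Iterator

# --- Pull model (REST-style): one explicit request per item ---
FAKE_DB = {1: "Leanne Graham", 2: "Ervin Howell"}

def get_user(user_id: int) -> str:
    """Stand-in for GET /users/{id}: you ask, you get one response."""
    return FAKE_DB[user_id]

# --- Push model (stream-style): events keep arriving; you subscribe ---
def event_stream() -> Iterator[dict]:
    """Stand-in for a topic: yields events as they 'occur'."""
    for i in range(3):
        yield {"event_id": i, "ts": time.time(), "type": "page_view"}

# Pull: each piece of data requires a separate call.
print(get_user(1))
print(get_user(2))

# Push: the consumer simply loops over whatever arrives.
for event in event_stream():
    print(event["event_id"], event["type"])
```

The consumer loop at the bottom is the key difference: it never asks for a specific record, it just processes each event as the producer emits it.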
We will use Confluent Cloud, a fully managed service for Apache Kafka, the industry standard for event streaming.