Building Async and Cloud Native organizations - Issue #19
Workspaces in Azure API Management, Semantic Kernel and GitHub Copilot X announcements!
Welcome to my weekly newsletter! Every week, I bring you the latest news, updates, and resources from the world of coding and architecture. I'm so glad you've decided to join me, and I can't wait to share my insights and expertise with you.
I hope you'll find this newsletter to be a valuable resource, and I welcome your feedback and suggestions. If there's something you'd like to see more of, or if you have any questions or comments, please don't hesitate to reach out to me.
Thank you for joining me, and happy reading!
REST and APIs
An exciting announcement for Azure API Management: workspaces! Having a dedicated API gateway for each team rarely makes sense, so this is typically a service you deploy as a shared offering in your organization: you expose it via an api.domain.com endpoint, manage the service itself, handle certificates, and in the end forward the calls to the underlying applications.
You do need to manage the API definitions somehow, and if different teams own them, what do you do? Give them contributor access, or have a dedicated team take ownership?
The first option provides flexibility and allows teams to be more independent. However, they can potentially alter other teams' APIs as well. And besides API definitions, they can also touch named values, products, groups, and so on, potentially breaking other APIs. So how do you get that separation?
Enter workspaces: these allow teams to use API Management to manage and access their APIs separately, independently of the management of the service infrastructure.
The team that operates the APIM instance creates workspaces and uses RBAC to grant teams permission to use them.
Example of a workspace for a feature team
The workspace members can develop, publish, and maintain the APIs in their workspace only, including setting up products, policies, subscriptions, and settings such as named values.
There is also a new level for the policy evaluation when using APIs in a workspace:
All APIs (service) > All APIs (workspace) > Product > API > API operation
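To make the evaluation order above concrete, here is a minimal sketch of what a workspace-level "All APIs" policy could look like (the rate limit is purely illustrative). The `<base />` element is what chains the scopes together: it pulls in the policy of the enclosing scope, here the service-level "All APIs" policy, before the workspace's own rules apply.

```xml
<!-- Illustrative workspace-scoped policy; numbers are made up. -->
<policies>
    <inbound>
        <base />
        <!-- Applies on top of whatever the service-level policy enforces. -->
        <rate-limit calls="100" renewal-period="60" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
```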
I believe this is a significant step forward, as it turns this shared service into something with clearly defined roles: a place where the APIM service is managed and a place where the individual teams manage their own APIs.
Read more on the Azure blog:
Microsoft announced Kiota, a code generator for HTTP APIs. If you are fed up with the different API implementations, or if you use plain HTTP calls but wonder how to handle things like retries, then this tool can be a great fit.
Kiota is a command line tool for generating an API client to call any OpenAPI described API you are interested in. The goal is to eliminate the need to take dependency on a different API SDK for every API that you need to call. Kiota API clients provide a strongly typed experience with all the features you expect from a high quality API SDK, but without having to learn a new library for every HTTP API.
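As a rough idea of the workflow (flag names from memory, so check `kiota generate --help` on your installed version), you point the tool at an OpenAPI description and it emits a typed client in the language of your choice:

```shell
# Illustrative: generate a C# client from a local OpenAPI description.
kiota generate -l csharp -d ./petstore.yaml -o ./PetStoreClient
```

You then use the generated client like any hand-written SDK, with the HTTP plumbing (serialization, retries, and so on) handled by Kiota's abstractions.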
It is hard not to talk about AI nowadays, and Microsoft released Semantic Kernel, which allows you to build AI-first apps. On their GitHub site, you can find some examples showing how to create a book using AI or get a summary of a chat conversation.
Are you still asking for a username and password on your site? That is pretty old school, as you can use biometrics instead! This blog tells you how to implement biometric authentication.
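Biometric authentication in the browser is typically built on the WebAuthn API, so the blog most likely revolves around a call like the sketch below (browser-only, values illustrative; in a real flow the challenge and user id come from your server):

```javascript
// Minimal WebAuthn registration sketch (runs in a browser, not Node).
// The platform authenticator prompts for a fingerprint, face scan, or PIN.
const credential = await navigator.credentials.create({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // server-issued in practice
    rp: { name: "Example Site" },
    user: {
      id: crypto.getRandomValues(new Uint8Array(16)),      // server-issued in practice
      name: "user@example.com",
      displayName: "Example User",
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }],   // ES256
    authenticatorSelection: { userVerification: "required" },
  },
});
// Send credential.response back to the server for verification and storage.
```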
Although it has been available in public beta for a while, the Project Roadmap feature is finally released, and it comes with some improvements as well.
And without a doubt the biggest announcement GitHub made last week: GitHub Copilot X. GitHub Copilot has been out for two years now and is a heavily used tool, with integrations into the major IDEs.
With Copilot X, the engine will now run on GPT-4, an upgrade from the GPT-3-based OpenAI Codex model. Copilot is also extended to support chat and voice, and brought to pull requests, the command line, and documentation.
With GitHub Copilot Voice you can use your voice to verbally give natural language prompts.
Copilot for Pull Requests brings the OpenAI GPT-4 model to your pull requests. It can add meaningful descriptions to your PR, but also tell you where you might need to add tests. It can even generate those tests for you!
And what if somebody submits an issue? You can use this AI to see how to tackle the problem, even let it create a Pull Request for you.
Copilot for Pull Requests creating a PR for an issue
If you are reviewing a PR, would it not be nice if the AI reviews along with you? Making suggestions or even resolving some of the issues?
Not all of this is available now, but it shows a glimpse of what will be possible soon.
With GitHub Copilot for Docs, you get a chat interface that provides AI-generated responses to questions about documentation, including questions developers have about the languages, frameworks, and technologies they're using.
Also announced is Copilot for the command line interface, to help you in the terminal by providing help and commands to get you going.
So some interesting announcements, all building on top of OpenAI's GPT-4, which has already shown huge potential.
Read more on the GitHub blog and sign up for the betas.
Computing in general
Did you know that there is now a search HTML element? You might wonder why you need this, as you could already set a role and an input type.
Read this blog on why it is important to have this element as well:
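For a quick impression, the new element replaces the old `role="search"` landmark pattern with a dedicated tag (a minimal sketch; names and labels are illustrative):

```html
<!-- Before: <div role="search"> ... </div> -->
<search>
  <form action="/search">
    <label for="q">Search this site</label>
    <input type="search" id="q" name="q" />
    <button>Search</button>
  </form>
</search>
```

Semantically it marks a landmark region for assistive technology, so screen-reader users can jump straight to it.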
Cloud systems economics often overlook short-term peak-to-average scaling. Understanding this gap is vital for large-scale cloud systems compared to single-tenant systems. Multi-tenancy reduces the peak-to-average ratio, aligning costs with value and enabling higher peaks for individual workloads, contributing to scalability.
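The peak-to-average effect is easy to see with a toy calculation: bursty workloads rarely peak at the same time, so the combined load curve is much flatter than any individual one (all numbers below are made up for illustration):

```python
# Toy illustration of multi-tenancy flattening the peak-to-average ratio.
# Each list is one tenant's load per hour; each tenant bursts in a different hour.
tenant_a = [1, 1, 1, 9, 1, 1]
tenant_b = [1, 9, 1, 1, 1, 1]
tenant_c = [1, 1, 9, 1, 1, 1]

def peak_to_average(load):
    """Ratio of the peak load to the average load."""
    return max(load) / (sum(load) / len(load))

# Provisioned separately, each tenant pays for its own peak.
separate = [peak_to_average(t) for t in (tenant_a, tenant_b, tenant_c)]

# Provisioned together, the peaks don't line up, so the shared curve is flatter.
combined = [a + b + c for a, b, c in zip(tenant_a, tenant_b, tenant_c)]

print(separate)                   # each tenant alone: roughly 3.86
print(peak_to_average(combined))  # shared: 11/7, roughly 1.57
```

The shared system needs far less headroom per unit of average load, which is exactly why costs align better with delivered value.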
Buy or build: it is always a tricky question. But is it really, if you consider that you are never going to be right all the time?
Helpers and utilities
Want to create a GitHub profile README, the one that shows up when people visit your username? With ProfileMe you get the raw markdown after filling in some fields.
Leslie Lamport's famous quip ("A distributed system is one in which the failure of a computer you didn't even know existed can render your own computer unusable") humorously highlights one of the challenges of distributed systems. In a distributed system, multiple interconnected computers work together, often relying on each other to perform tasks or maintain availability. The quote implies that the failure of one computer in the system, even one unknown to the user, can impact the overall functionality and make their own computer unusable. While exaggerated, it emphasizes the importance of understanding the dependencies, fault tolerance, and resilience of distributed systems, to prevent failures from cascading through the network.
I hope you've enjoyed this week's issue of my newsletter. If you found it helpful, I invite you to share it with your friends and colleagues. And if you're not already a subscriber, be sure to sign up to receive future issues.
Next week, I'll be back with more articles, tutorials, and resources to help you stay up-to-date on the latest developments in coding and architecture. In the meantime, keep learning and growing, and happy coding!
Best regards, Michiel