Monday, November 25, 2024

Hacking our way to better team meetings


Since this blog was originally published, I have also released the Distill CLI. You can read the follow-up blog post or play with the code on GitHub.


As someone who takes a lot of notes, I'm always on the lookout for tools and strategies that can help me refine my own note-taking process (like the Cornell Method). And while I generally prefer pencil and paper (because it's been shown to help with retention and synthesis), there's no denying that technology can help enhance the skills we've already developed. This is especially true in situations like meetings, where actively participating and taking notes at the same time can be at odds with each other. The distraction of looking down to jot something, or tapping away at a keyboard, can make it difficult to stay engaged in the conversation: it forces us to make quick decisions about which details are important, and there's always the risk of missing something while trying to capture the previous point. Not to mention that with back-to-back meetings, summarizing and extracting important details from pages of notes becomes even harder, and at the group level, this kind of administrative overhead adds up to a significant loss of individual and team time in modern businesses.

Faced with these problems on a daily basis, my team (a small tiger team I like to call OCTO, the Office of the CTO) saw an opportunity to use AI to augment our team meetings. They developed a simple, straightforward proof of concept for us that uses AWS services like Lambda, Transcribe, and Bedrock to transcribe and summarize our virtual team meetings. It lets us collect notes from our meetings while staying focused on the conversation itself, since the granular details of the discussion are automatically captured (it even creates a to-do list). And today, we're open sourcing the tool, which our team calls "Distill", in the hope that others might find it useful too: https://github.com/aws-samples/amazon-bedrock-audio-summarizer.

In this post, I’ll walk you through the high-level architecture of our project, how it works, and give you a preview of how I’ve been working alongside Amazon Q Developer to turn Distill into a Rust CLI.

The Anatomy of a Simple Audio Summarization App

The app itself is simple, and that's intentional. I subscribe to the idea that systems should be made as simple as possible, but not simpler. First, we upload an audio file of our meeting to an S3 bucket. An S3 trigger then notifies a Lambda function, which starts the transcription process. An EventBridge rule automatically invokes a second Lambda function whenever a Transcribe job whose name starts with summarizer- has a newly updated status of COMPLETED. Once transcription is complete, this Lambda function takes the transcript and sends it, along with an instruction, to Bedrock to create a summary. In our case, we used Claude 3 Sonnet for inference, but you can adapt the code to use any model available in Bedrock. When inference is complete, the summary of our meeting, including high-level takeaways and to-dos, is stored back in our S3 bucket.
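To make that flow concrete, here is a minimal sketch of what the second Lambda function could look like, in the spirit of the repo's Python code. The bucket name, key layout, and prompt wording are illustrative assumptions, not the actual ones used by the project:

```python
# Hypothetical sketch of the summarization Lambda: read a finished transcript
# from S3, ask Claude 3 Sonnet via Bedrock for a summary, write it back.
# Bucket and key names below are illustrative, not the repo's actual values.
import json

PROMPT = (
    "Summarize the following meeting transcript in readable paragraphs, "
    "then list key action items:\n\n{transcript}"
)

def handler(event, context):
    import boto3  # deferred so the module imports without AWS dependencies

    s3 = boto3.client("s3")
    bedrock = boto3.client("bedrock-runtime")

    # The EventBridge rule fires for summarizer-* jobs with status COMPLETED;
    # we assume the job name maps onto the transcript's object key.
    job_name = event["detail"]["TranscriptionJobName"]
    obj = s3.get_object(Bucket="my-distill-bucket",
                        Key=f"transcripts/{job_name}.json")
    transcript = json.loads(obj["Body"].read())[
        "results"]["transcripts"][0]["transcript"]

    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 1024,
            "messages": [
                {"role": "user",
                 "content": PROMPT.format(transcript=transcript)},
            ],
        }),
    )
    summary = json.loads(response["body"].read())["content"][0]["text"]
    s3.put_object(Bucket="my-distill-bucket",
                  Key=f"processed/{job_name}.txt",
                  Body=summary.encode("utf-8"))
    return {"status": "ok"}
```

Swapping the model is a one-line change to `modelId`, which is what makes Bedrock a nice fit here.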

Distill architecture diagram

I have spoken many times about the importance of treating infrastructure as code, and as such, we used the AWS CDK to manage this project's infrastructure. The CDK gives us a reliable, consistent way to deploy resources, and ensures the infrastructure is reproducible by anyone. Beyond that, it also gave us a good way to quickly iterate on our ideas.
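As a sketch of what that looks like in practice, here is a hypothetical CDK (Python) fragment wiring up the pieces described above. Construct IDs, prefixes, and asset paths are my own illustrations, not the ones the repo uses:

```python
# Hedged sketch of a CDK stack for the pipeline: S3 bucket -> transcription
# Lambda (S3 trigger) -> summarization Lambda (EventBridge rule on Transcribe
# job completion). Names and paths are illustrative assumptions.
from aws_cdk import (
    Duration, Stack,
    aws_events as events,
    aws_events_targets as targets,
    aws_lambda as _lambda,
    aws_s3 as s3,
    aws_s3_notifications as s3n,
)
from constructs import Construct

class DistillStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        bucket = s3.Bucket(self, "MediaBucket")

        transcribe_fn = _lambda.Function(
            self, "StartTranscription",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="transcribe.handler",
            code=_lambda.Code.from_asset("lambda"),
        )
        # New audio uploads under source/ kick off transcription.
        bucket.add_event_notification(
            s3.EventType.OBJECT_CREATED,
            s3n.LambdaDestination(transcribe_fn),
            s3.NotificationKeyFilter(prefix="source/"),
        )

        summarize_fn = _lambda.Function(
            self, "Summarize",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="summarize.handler",
            code=_lambda.Code.from_asset("lambda"),
            timeout=Duration.minutes(5),
        )
        # Fire when a summarizer-* Transcribe job reaches COMPLETED.
        events.Rule(
            self, "TranscribeCompleted",
            event_pattern=events.EventPattern(
                source=["aws.transcribe"],
                detail_type=["Transcribe Job State Change"],
                detail={
                    "TranscriptionJobName": [{"prefix": "summarizer-"}],
                    "TranscriptionJobStatus": ["COMPLETED"],
                },
            ),
            targets=[targets.LambdaFunction(summarize_fn)],
        )
```

Because the whole pipeline is a single stack, trying out an idea is a `cdk deploy` away, and tearing it down again is just as easy.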

Using Distill

If you give this a try (and I hope you do), setup is quick. Clone the repository and follow the steps in the README to deploy the application infrastructure to your account using the CDK. After that, there are two ways to use the tool:

  1. Drop an audio file directly into the source folder of the S3 bucket created for you, wait a few minutes, and then view the results in the processed folder.
  2. Use the Jupyter notebook we created to walk through the audio upload process, monitor the transcription, and retrieve the audio summary.
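If you'd rather script the first option than use the console or the notebook, a hedged boto3 sketch might look like the following. The bucket name, prefixes, result-key convention, and polling interval are all my assumptions, not what the project prescribes:

```python
# Illustrative sketch: upload a meeting recording, then poll the processed/
# prefix until the summary appears. Prefixes and naming are assumptions.
import time

def result_key_for(audio_name: str) -> str:
    """Map an uploaded audio file name to the key we assume the pipeline writes."""
    return f"processed/{audio_name.rsplit('.', 1)[0]}.txt"

def summarize_meeting(bucket: str, audio_path: str, timeout_s: int = 600) -> str:
    import boto3  # deferred so the helper above stays usable without AWS deps

    s3 = boto3.client("s3")
    name = audio_path.rsplit("/", 1)[-1]
    s3.upload_file(audio_path, bucket, f"source/{name}")

    result_key = result_key_for(name)
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        try:
            obj = s3.get_object(Bucket=bucket, Key=result_key)
            return obj["Body"].read().decode("utf-8")
        except s3.exceptions.NoSuchKey:
            time.sleep(15)  # transcription + inference take a few minutes
    raise TimeoutError(f"No summary appeared at s3://{bucket}/{result_key}")
```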

Below is an example output (minimally sanitized) from a recent OCTO team meeting that only part of the team was able to attend:

Here is a summary of the conversation in readable paragraphs:

The group discussed possible content ideas and approaches for upcoming events such as VivaTech and re:Invent. There were suggestions around giving keynote talks versus more informal fireside chats or panel discussions. The importance of developing thought-provoking content for these events was emphasized.

Recapping Werner’s recent tour of Asia, the team reflected on highlights such as interacting with university students, developers, startups, and local underserved communities. Indonesia’s initiatives around disability inclusion were praised. Helpful feedback was shared on logistics, how to balance work with downtime, and optimal event formats for Werner. The group plans to investigate how to turn these learnings into an internal newsletter.

Other topics covered included upcoming advisory meetings, which Jeff could attend virtually, and the changing role of the modern CTO with an increased focus on social impact and global perspectives.

Key action items:

  • Reschedule team meeting for next week
  • Lisa will circulate the agenda for the next consultative meeting when it is available
  • Roger will draft possible panel questions for VivaTech
  • Explore recording/streaming options for the VivaTech panel
  • Determine ownership of content between teams to summarize highlights from the Asia tour

What’s more, the team has created a Slack webhook that automatically posts these summaries to a team channel, so those who couldn’t attend can catch up on what was discussed and quickly review what actions to take.

Remember, AI is not perfect. Some of the summaries we get back, including the one above, have errors that need manual adjustment. But that's okay, because it still speeds up our processes. It's simply a reminder that we must remain discerning and stay involved in the process. Critical thinking is as important now as it has ever been.

There's value in solving everyday problems

This is just one example of a simple app that can be built quickly, deployed to the cloud, and lead to organizational efficiencies. Depending on which study you look at, around 30% of corporate employees say they fail to complete tasks because they can't remember key information from meetings. We can start chipping away at statistics like that by providing personalized notes immediately after a meeting, or an assistant that automatically creates work items from a meeting and assigns them to the right person. It's not always about solving the "big" problem in one swing with technology. Sometimes it's about solving everyday problems, and finding simple solutions that become the foundation for incremental and meaningful innovation.

I'm particularly interested in where this goes next. We now live in a world where an AI-powered bot can sit in on your calls and act in real time: taking notes, answering questions, tracking tasks, removing PII, even looking up things that would have otherwise distracted and slowed down the call while a person hunted for the data. By sharing our simple app, the intent isn't to show off "something new and shiny", but to show you that if we can build it, so can you. And I'm curious to see how the open source community will use it. How will they extend it? What will they build on top of it? This is what I find really exciting: the potential of simple AI-based tools to help us in more and more ways. Not as replacements for human ingenuity, but as aids that make us better.

To that end, working on this project with my team has inspired me to take on my own pet project: turning this tool into a Rust CLI.

Building a Rust CLI from scratch

I blame Marc Brooker and Colm MacCarthaigh for turning me into a Rust enthusiast. I'm a systems programmer at heart, and that heart started beating a lot faster the more familiar I became with the language. It became even more important to me after I came across Rui Pereira's wonderful research on the energy, time, and memory consumption of different programming languages, when I realized Rust's tremendous potential to help us build more sustainably in the cloud.

During our experiments with Distill, we wanted to see what effect moving a function from Python to Rust would have. With the CDK, it was easy to make a quick change to our stack that moved a Lambda function onto the AL2023 runtime, then deploy a Rust-based version of the code. If you're curious: the function averaged cold starts that were 12x faster (34 ms vs. 410 ms) and used 73% less memory (21 MB vs. 79 MB) than its Python variant. Inspired, I decided to get my hands really dirty. I was going to turn this project into a command line utility, and put some of what I'd learned from Ken Youens-Clark's "Command-Line Rust" into practice.
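For reference, the CDK side of such a swap can be small. This is a hypothetical sketch (construct ID and asset path are assumptions, and it presumes the Rust binary was built separately, e.g. with cargo-lambda) of defining a function on the provided AL2023 runtime:

```python
# Hedged sketch: a Lambda function on the provided AL2023 runtime, backed by
# a pre-built Rust binary. The asset path and construct ID are illustrative.
from aws_cdk import aws_lambda as _lambda

def rust_summarizer(scope) -> "_lambda.Function":
    return _lambda.Function(
        scope, "SummarizeRust",
        runtime=_lambda.Runtime.PROVIDED_AL2023,  # custom runtime for Rust
        handler="bootstrap",  # the executable name the provided runtime looks for
        code=_lambda.Code.from_asset("target/lambda/summarize"),
        memory_size=128,
    )
```

Since the provided runtime just executes a `bootstrap` binary, the rest of the stack doesn't need to know or care that the function's implementation language changed.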

I've always loved working from the command line. Every grep, cat, and curl in that little black box reminds me a lot of driving an old car. It may take a little longer to turn over, it may make some noises and complain, but you feel a connection to the machine. And being hands-on with the code, like taking notes, helps things stick.

Not being a Rust guru, I decided to put Q to the test. I still had plenty of questions about the language, its idioms, the ownership model, and common libraries I'd seen in sample code, like Tokio. If I'm being honest, learning to interpret what the compiler is objecting to is probably the hardest part of programming in Rust for me. With Q open in my IDE, it was easy to ask "stupid" questions without stigma, and using the references it provided meant I didn't have to dig through piles of documentation.

An overview of Tokio

As the CLI began to take shape, Q took on a larger role, providing deeper insights that informed coding and design decisions. For example, I was curious whether using slice references would introduce inefficiencies with large lists of items. Q quickly explained that while slices of arrays can be more efficient than creating new arrays, there is still the potential for performance impacts at scale. It felt like a conversation: I could bounce ideas off Q, ask follow-up questions freely, and get immediate, non-judgmental responses.

Q offering tips on Rust

The last thing I'll mention is the ability to send code directly to Q. I've been experimenting with code refactoring and optimization, and it has both helped me understand Rust better and pushed me to think more critically about the code I write. It goes to show how important it is to build tools that meet builders where they're already comfortable: in my case, the IDE.

Send code to Q

Coming soon…

In the coming weeks, the plan is to share the code for my Rust CLI. I need a little time to polish it, and to have it reviewed by folks with a bit more experience, but here's a sneak peek:

A preview of the Rust CLI

As always, now go build! And get your hands dirty while you’re at it.
