Get insights from multimodal content with Amazon Bedrock Data Automation, now generally available

Many applications need to interact with content available through different modalities. Some of these applications process complex documents, such as insurance claims and medical bills. Mobile apps need to analyze user-generated media. Organizations need to build a semantic index on top of their digital assets, which include documents, images, audio, and video files. However, getting insights from unstructured multimodal content is not easy to set up: you have to build processing pipelines for the different data formats and go through multiple steps to get the information you need. That usually means running multiple models in production, for which you have to handle cost optimization (through fine-tuning and prompt engineering), safeguards (for example, against hallucinations), integrations with the target applications (including…
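To make the contrast with a hand-built pipeline concrete, here is a minimal sketch of submitting a file stored in Amazon S3 to Bedrock Data Automation for asynchronous processing with boto3. The bucket names and key are placeholders, and the client and operation names (bedrock-data-automation-runtime, invoke_data_automation_async, get_data_automation_status) and their parameters are assumptions that may differ by SDK version, so treat this as an outline rather than the post's own example.

    import time

    import boto3

    # Assumption: the boto3 "bedrock-data-automation-runtime" client exposes
    # invoke_data_automation_async and get_data_automation_status; parameter
    # names and required fields can vary by SDK version.
    bda_runtime = boto3.client("bedrock-data-automation-runtime")

    # Start an asynchronous analysis of a document stored in S3
    # (hypothetical bucket names and object key).
    response = bda_runtime.invoke_data_automation_async(
        inputConfiguration={"s3Uri": "s3://my-input-bucket/claims/claim-001.pdf"},
        outputConfiguration={"s3Uri": "s3://my-output-bucket/bda-results/"},
    )
    invocation_arn = response["invocationArn"]

    # Poll the invocation until it reaches a terminal state, then report it.
    while True:
        status = bda_runtime.get_data_automation_status(invocationArn=invocation_arn)
        if status["status"] not in ("Created", "InProgress"):
            break
        time.sleep(5)

    print(status["status"])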

AWS Weekly Roundup: Anthropic Claude 3.7, JAWS Days, cross-account access, and more (March 3, 2025)

I have fond memories of the time I built an application live at the AWS GenAI Loft London last September. AWS GenAI Lofts are back in locations such as San Francisco and Berlin to continue providing collaborative spaces and immersive experiences for startups and developers. Find a loft near you for hands-on access to AI products and services, events, workshops, and networking opportunities that you can’t miss!

Last week’s launches

Here are some launches that got my attention during the previous week.

Four ways to grant cross-account access in AWS — In some situations, you might want to enable centralized operations across multiple AWS accounts or share resources across teams or projects within your organization. In these cases, you…
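One common way to grant cross-account access is to assume an IAM role in the target account with AWS STS and use the temporary credentials it returns. Below is a minimal sketch of that pattern with boto3; the role ARN, account ID, and session name are placeholders, and the post itself covers other approaches as well.

    import boto3

    # Assume an IAM role in another AWS account (placeholder ARN) and obtain
    # temporary credentials scoped to that role's permissions.
    sts = boto3.client("sts")
    assumed = sts.assume_role(
        RoleArn="arn:aws:iam::111122223333:role/CrossAccountReadOnly",
        RoleSessionName="cross-account-demo",
    )
    creds = assumed["Credentials"]

    # Use the temporary credentials to call a service in the other account,
    # here listing S3 buckets as a simple check.
    s3 = boto3.client(
        "s3",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
    print([bucket["Name"] for bucket in s3.list_buckets()["Buckets"]])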