SM-Ai: Audio Metadata AI Processor
Overview
SM-Ai is a macOS application designed to enhance and standardize audio file metadata using artificial intelligence. The application processes audio metadata from Soundminer databases, leveraging large language models to analyze, categorize, and reformat metadata.
Key Features
AI-Powered Metadata Processing: Uses OpenAI GPT models, Anthropic Claude models, or local LLMs to intelligently process and enhance audio metadata
UCS Category Intelligence: Built-in Universal Category System support with 850+ predefined categories for accurate sound classification
Flexible Output: Write AI-generated content to any metadata field with options to append, prepend, or replace existing content
Batch Processing: Handle multiple files efficiently through Soundminer's batch processing capabilities
Custom Prompts: Create and save custom prompt templates for different metadata enhancement workflows
How It Works
Select audio files in Soundminer
Run the included Lua script to send metadata to SM-Ai
SM-Ai processes the metadata through your chosen AI model
Enhanced metadata is written back to your specified field in Soundminer
System Requirements
macOS 13.5 or later
Soundminer V4.5Pro, V5Pro, V6 Pro or V6 Plus
Internet connection (for cloud AI providers) or local LLM setup
API Setup
SM-Ai supports three types of AI providers: OpenAI, Anthropic Claude, and local LLMs via Ollama.
OpenAI Setup
OpenAI provides the GPT family of models (GPT-4o, GPT-4o-mini, GPT-3.5-turbo).
Create an Account
Visit: https://platform.openai.com/signup
Generate an API Key
Log in to your account
Navigate to: https://platform.openai.com/api-keys
Click "Create new secret key"
Give it a descriptive name (e.g., "SM-Ai Metadata Processing")
Copy the key immediately (you won't be able to see it again)
Add Billing Information
Add a payment method
Consider setting usage limits to control costs
Enter Key in SM-Ai
Open SM-Ai Settings (⌘,)
Select an OpenAI model from the AI Provider dropdown
Paste your API key in the API Key field
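If you want to confirm that a new key works before running a batch, you can call the OpenAI Chat Completions API directly. The sketch below is illustrative only and independent of SM-Ai; the model name, environment variable, and prompt are placeholders, not SM-Ai's internal request.

```python
# Minimal sketch: verify an OpenAI API key outside of SM-Ai.
# Assumes the key is stored in the OPENAI_API_KEY environment variable;
# the model and prompt are placeholders.
import json, os, urllib.request

payload = {
    "model": "gpt-4o-mini",
    "messages": [
        {"role": "system", "content": "You clean up audio file metadata."},
        {"role": "user", "content": "Rewrite as a clean description: 'dr_slam_04 heavy wood door slam int'"},
    ],
}
req = urllib.request.Request(
    "https://api.openai.com/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "Content-Type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

If the key is valid and billing is set up, this prints a short rewritten description; an authentication error means the key was pasted incorrectly or billing is not active.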
Anthropic Claude Setup
Anthropic provides the Claude family of models (Claude Opus, Sonnet, and Haiku).
Create an Account
Visit: https://console.anthropic.com/signup
Generate an API Key
Log in to the Anthropic Console
Navigate to: https://console.anthropic.com/settings/keys
Click "Create Key"
Give it a descriptive name (e.g., "SM-Ai")
Copy the key immediately
Add Credits
Purchase API credits (minimum $5)
Claude API uses prepaid credits rather than monthly billing
Enter Key in SM-Ai
Open SM-Ai Settings (⌘,)
Select a Claude model from the AI Provider dropdown
Paste your API key in the API Key field
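As with OpenAI, you can test a Claude key directly against the Anthropic Messages API before processing a library. This is a minimal sketch, separate from SM-Ai; the model alias, environment variable, and prompt are placeholders.

```python
# Minimal sketch: verify an Anthropic API key outside of SM-Ai.
# Assumes the key is in the ANTHROPIC_API_KEY environment variable;
# the model alias and prompt are placeholders.
import json, os, urllib.request

payload = {
    "model": "claude-3-5-haiku-latest",
    "max_tokens": 200,
    "messages": [
        {"role": "user", "content": "Rewrite as a clean description: 'dr_slam_04 heavy wood door slam int'"},
    ],
}
req = urllib.request.Request(
    "https://api.anthropic.com/v1/messages",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "x-api-key": os.environ["ANTHROPIC_API_KEY"],
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["content"][0]["text"])
```

A credit-balance error here means the prepaid credits described above have not been purchased yet.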
Local LLM Setup (Ollama)
For offline processing and/or cost savings, you can use local language models through Ollama.
Download and Install Ollama
Visit: https://ollama.com/download
Install a Model
You can install a new model using either Terminal or the GUI chat window.
Terminal mode: ollama pull llama3.1
GUI mode: Choose a model in Ollama’s chat app and it will download and install the chosen model once you initiate a chat (i.e., type “hello”)
Start Ollama Server
Ollama runs automatically in the background after installation. You can verify it's running by typing the following into Terminal:
ollama list
Configure SM-Ai
Open SM-Ai Settings (⌘,)
Select "Local LLM" from the AI Provider dropdown
Choose from your downloaded models
Verify the Base URL is: http://localhost:11434/v1/chat/completions
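Because that Base URL is an OpenAI-compatible chat completions endpoint served by Ollama, you can check it responds the same way as the cloud providers. A minimal sketch, assuming the llama3.1 model pulled above; no API key is needed for a local server.

```python
# Minimal sketch: confirm the local Ollama endpoint is reachable.
# Assumes the llama3.1 model installed earlier; adjust the model name if you pulled a different one.
import json, urllib.request

payload = {
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "Reply with the single word 'ready'."}],
}
req = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

If the request is refused, open the Ollama app (or run ollama list in Terminal) to confirm the server is running.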
Note: Local LLMs require significant system resources. For best performance:
Apple Silicon Macs (M1/M2/M3) with 16GB+ RAM recommended
Larger models (70B+) require 32GB+ RAM
Processing will be slower than cloud APIs but is free and private
Cost Considerations
Cloud AI Pricing (Approximate as of October 2025)
OpenAI:
GPT-4o: ~$2.50 per 1,000 prompts (typical metadata processing)
GPT-4o-mini: ~$0.15 per 1,000 prompts
GPT-3.5-turbo: ~$0.50 per 1,000 prompts
Anthropic:
Claude Opus 4.1: ~$15 per 1,000 prompts
Claude Sonnet 4/4.5: ~$3 per 1,000 prompts
Claude Haiku 3.5: ~$0.80 per 1,000 prompts
Local LLM:
Free (no per-use cost)
Actual costs vary based on prompt length and model responses. Monitor your usage through the respective provider dashboards.
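For a rough budget before a large batch, multiply your file count by the approximate per-1,000-prompt figures above. The sketch below simply automates that arithmetic using the document's approximations; it is not live pricing, and real spend depends on prompt length and response size.

```python
# Rough cost estimate from the approximate per-1,000-prompt figures listed above.
# These rates are this guide's approximations, not live provider pricing.
RATES_PER_1000 = {
    "GPT-4o": 2.50,
    "GPT-4o-mini": 0.15,
    "Claude Sonnet 4.5": 3.00,
    "Claude Haiku 3.5": 0.80,
}

def estimate_cost(num_files: int, model: str) -> float:
    """Approximate USD cost to process num_files prompts with the given model."""
    return num_files * RATES_PER_1000[model] / 1000

print(f"10,000 files on GPT-4o-mini: ~${estimate_cost(10_000, 'GPT-4o-mini'):.2f}")
```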
Getting Started
Install SM-Ai: Launch the application and complete the initial setup
Configure API: Follow the API setup instructions above for your chosen provider
Install Soundminer Script: Copy the included Send_to_SM-Ai.lua script to your Soundminer Scripts folder (typically ~/Library/Application Support/Soundminer/Scripts/)
Start Processing: Select files in Soundminer and run the script
Learning Prompt Engineering
Effective prompt engineering is key to getting the best results from AI models. These resources will help you understand how to craft better prompts:
General Prompt Engineering
OpenAI Prompt Engineering Guide
https://platform.openai.com/docs/guides/prompt-engineering
Comprehensive guide covering strategies and tactics for better prompts
Anthropic Prompt Engineering Tutorial
https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/overview
Claude-specific techniques and best practices
Learn Prompting (Community Resource)
https://learnprompting.org/
Free, comprehensive course on prompt engineering fundamentals
Advanced Techniques
Prompt Engineering Guide by DAIR.AI
https://www.promptingguide.ai/
In-depth coverage of advanced techniques and research
Anthropic's "How to build with Claude" Series
https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/chain-prompts
Advanced topics including chain-of-thought and complex reasoning
Key Concepts to Learn
Clear Instructions: Be explicit about what you want
Context Provision: Give the AI relevant background information
Examples: Show the AI what good output looks like
Output Format: Specify exactly how you want results structured
Iterative Refinement: Start simple and progressively improve your prompts
System Messages: Use system prompts to set consistent behavior
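Putting these concepts together, a metadata prompt typically pairs a system message that fixes the role and output format with a user message carrying the file's existing metadata. The example below is purely hypothetical and is not SM-Ai's built-in template; the field names and format are illustrative.

```python
# Hypothetical prompt structure illustrating the concepts above;
# not SM-Ai's built-in template.
messages = [
    {
        "role": "system",
        "content": (
            "You are an audio metadata editor. "
            "Rewrite descriptions in sentence case, under 120 characters, "
            "and suggest one UCS category. "
            "Output format: Description | UCS Category"
        ),
    },
    {
        # The existing metadata for one file goes in the user message.
        "role": "user",
        "content": "Filename: dr_slam_04.wav | Description: heavy wood door slam int",
    },
]
```

Starting from a skeleton like this, iterate: add one or two example input/output pairs from your own library, then tighten the format rules as you see how the model responds.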
Tips for Best Results
Be Specific: The more detailed your prompts, the better the results
Provide Context: Include relevant information about your audio library's organization
Use Examples: Show the AI examples of good descriptions from your library
Test First: Process a small batch before running on your entire library
Monitor Costs: Check your API usage regularly, especially when starting out
Backup: Always backup your database before batch processing
Privacy & Data
Cloud AI: Your metadata is sent to OpenAI or Anthropic servers for processing. Review their privacy policies before use.
Local LLM: All processing happens on your computer; no data is sent externally.
SM-Ai: Does not collect or transmit any user data.
