Important
We recommend updating to the latest agent version as soon as it's available. If your organization has established practices that prevent you from updating to the latest version, ensure that your agents are regularly updated to a version that's at most 90 days old. Read more about keeping your agent up to date.
3.31.0
Added
- Integration packages to instrument AI model invocations (see below).
  - New package nrawsbedrock v1.0.0 introduced to instrument calls to the Amazon Bedrock Runtime Client API InvokeModel and InvokeModelWithResponseStream calls. Also provides a simple one-step method which invokes stream invocations and harvests the response stream data for you.
  - New package nropenai v1.0.0 introduced to instrument calls to OpenAI using NRCreateChatCompletion, NRCreateChatCompletionStream, and NRCreateEmbedding calls. A usage sketch follows this list.
- Dockerfile in the examples/server sample app which facilitates the easy creation of a containerized, ready-to-run sample app for situations where that makes testing easier.
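To illustrate the new OpenAI instrumentation, here is a minimal sketch of wiring NRCreateChatCompletion into an application. The client wrapper constructor (NRNewClient) and the exact shape of the returned response are assumptions about the nropenai v1.0.0 API; consult the package documentation and the integration's bundled examples for the authoritative signatures.

```go
package main

import (
	"fmt"
	"os"
	"time"

	"github.com/newrelic/go-agent/v3/integrations/nropenai"
	"github.com/newrelic/go-agent/v3/newrelic"
	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Create the agent application with AI monitoring enabled.
	app, err := newrelic.NewApplication(
		newrelic.ConfigAppName("openai-sample"),
		newrelic.ConfigLicense(os.Getenv("NEW_RELIC_LICENSE_KEY")),
		newrelic.ConfigAIMonitoringEnabled(true),
	)
	if err != nil {
		panic(err)
	}
	app.WaitForConnection(10 * time.Second)

	// Assumed constructor: wraps the go-openai client so calls are instrumented.
	client := nropenai.NRNewClient(os.Getenv("OPENAI_API_KEY"))

	req := openai.ChatCompletionRequest{
		Model: openai.GPT3Dot5Turbo,
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Say hello."},
		},
	}

	// NRCreateChatCompletion records the LLM events for this invocation
	// (argument order is an assumption).
	resp, err := nropenai.NRCreateChatCompletion(client, req, app)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", resp)

	app.Shutdown(10 * time.Second)
}
```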
Fixed
- .Ignore was not ignoring the transaction. Fixes Issue #845.
- Added nil error check in wrap function. Fixes Issue #862.
- WrapBackgroundCore background logger was not sending logs to New Relic. Fixes Issue #859.
- Corrected the pgx5 integration example, which caused a race condition. Thanks to @WillAbides! Fixes Issue #855.
- Updated third-party library versions due to reported security or other supportability issues:
  - github.com/jackc/pgx/v5 to 5.5.4 in the nrpgx5 integration
  - google.golang.org/protobuf to 1.33.0 in the nrmicro and nrgrpc integrations
  - github.com/jackc/pgx/v4 to 4.18.2 in the nrpgx integration
AI Monitoring Configuration
New configuration options are available specific to AI monitoring. These settings include (a configuration sketch follows the list):
- AIMonitoring.Enabled, configured via ConfigAIMonitoringEnabled(bool) [default false]
- AIMonitoring.Streaming.Enabled, configured via ConfigAIMonitoringStreamingEnabled(bool) [default true]
- AIMonitoring.Content.Enabled, configured via ConfigAIMonitoringContentEnabled(bool) [default true]
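A minimal sketch of setting these options when creating the application. The option constructor names are taken as listed above; verify them against the newrelic package documentation, especially the content-recording option.

```go
package main

import (
	"os"

	"github.com/newrelic/go-agent/v3/newrelic"
)

func main() {
	// AI monitoring is off by default; the streaming and content settings
	// default to true once AI monitoring itself is enabled.
	app, err := newrelic.NewApplication(
		newrelic.ConfigAppName("ai-monitoring-sample"),
		newrelic.ConfigLicense(os.Getenv("NEW_RELIC_LICENSE_KEY")),
		newrelic.ConfigAIMonitoringEnabled(true),          // AIMonitoring.Enabled
		newrelic.ConfigAIMonitoringStreamingEnabled(true), // AIMonitoring.Streaming.Enabled
		newrelic.ConfigAIMonitoringContentEnabled(true),   // AIMonitoring.Content.Enabled (name as listed above)
	)
	if err != nil {
		panic(err)
	}
	_ = app // use app to instrument transactions as usual
}
```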
AI Monitoring Public API Methods
Two new AI monitoring-related public API methods have been added, as methods of the newrelic.Application value returned by newrelic.NewApplication.
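The two methods are not enumerated in this extract. Purely as an illustration, the sketch below assumes they are RecordLLMFeedbackEvent and SetLLMTokenCountCallback; both names and signatures are assumptions and should be checked against the v3.31.0 package documentation.

```go
package main

import (
	"os"

	"github.com/newrelic/go-agent/v3/newrelic"
)

func main() {
	app, err := newrelic.NewApplication(
		newrelic.ConfigAppName("ai-api-sample"),
		newrelic.ConfigLicense(os.Getenv("NEW_RELIC_LICENSE_KEY")),
		newrelic.ConfigAIMonitoringEnabled(true),
	)
	if err != nil {
		panic(err)
	}

	// Assumed method: register a callback the agent can use to compute token
	// counts when the provider response does not include them.
	app.SetLLMTokenCountCallback(func(model, content string) int {
		return len(content) / 4 // crude heuristic, for illustration only
	})

	// Assumed method: attach end-user feedback to a previously recorded LLM interaction.
	app.RecordLLMFeedbackEvent("trace-id-from-llm-call", 5, "informative",
		"helpful answer", map[string]interface{}{"source": "example"})
}
```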
AI Monitoring
New Relic AI monitoring is the industry’s first APM solution that provides end-to-end visibility into the key components of an AI Large Language Model (LLM) application. With AI monitoring, users can monitor, alert on, and debug AI-powered applications for reliability, latency, performance, security, and cost. AI monitoring also enables AI/LLM-specific insights (metrics, events, logs, and traces) which can easily be integrated to build advanced guardrails for enterprise security, privacy, and compliance.
AI monitoring offers custom-built insights and tracing for the complete lifecycle of an LLM’s prompts and responses, from raw user input to repaired/polished responses. AI monitoring provides built-in integrations with popular LLMs and components of the AI development stack. This release provides instrumentation for OpenAI and Bedrock.
When AI monitoring is enabled with ConfigAIMonitoringEnabled(true), the agent will capture AI LLM related data. This data will be visible under a new APM tab called AI Responses. See our AI Monitoring documentation for more details.
Support statement
We use the latest version of the Go language. At minimum, you should be using a version of Go no older than what is supported by the Go team themselves. See the Go agent EOL Policy for details about supported versions of the Go agent and third-party components.