Notes
This release of the Python agent adds a flag for the Azure init container operator.
Install the agent using easy_install/pip/distribute
via the Python Package Index or download it directly from the New Relic download site.
New features
Add Azure Init Container Operator Flag
- New Relic can be integrated into applications hosted on Microsoft Azure, either directly, through an init container for Azure Container Apps, or through a pre-built startup script for Microsoft Azure App Service. You can flag that this is the installation method used by setting the NEW_RELIC_AZURE_OPERATOR_ENABLED environment variable.
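For illustration, a minimal sketch of setting this flag programmatically is below; in practice the variable is usually set in the container or App Service configuration, and the config-file path shown is a placeholder.

```python
import os

# Flag the Azure init container / startup script install method before the
# agent starts. Normally set in the container or App Service settings.
os.environ["NEW_RELIC_AZURE_OPERATOR_ENABLED"] = "true"

import newrelic.agent

newrelic.agent.initialize("newrelic.ini")  # config-file path is a placeholder
```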
Support statement
We recommend updating to the latest agent version as soon as it's available. If you can't upgrade to the latest version, update your agents to a version no more than 90 days old. Read more about keeping agents up to date.
See the New Relic Python agent EOL policy for information about agent releases and support dates.
Notes
This release of the Python agent adds support for Python 3.13, adds a new Large Language Model (LLM) API, supports Docker ID parsing in Amazon ECS Fargate environments, and instruments a new Langchain vectorstore.
Install the agent using easy_install/pip/distribute
via the Python Package Index or download it directly from the New Relic download site.
New features
Add support for Python 3.13
- The agent now supports applications running in Python 3.13.
Add new LLM custom attribute context manager API
- The agent now includes a new context manager API that adds custom attributes to LLM events generated from calls to LLMs in application code. For more information on usage, please see our API documentation.
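A minimal sketch of what usage might look like is below; the context manager name WithLlmCustomAttributes and its dictionary argument are assumptions here, so check the API documentation for the exact interface.

```python
import newrelic.agent

@newrelic.agent.background_task()
def answer_question(client, prompt):
    # Assumed API: custom attributes passed here are attached to the LLM
    # events generated by calls made inside the "with" block.
    with newrelic.agent.WithLlmCustomAttributes({"workflow": "faq", "tenant": "example"}):
        # Placeholder call; any instrumented LLM client works here.
        return client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
```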
Add support for reporting of Amazon ECS Fargate Docker IDs
- The agent now reports Docker IDs for containers running in ECS Fargate environments.
Add instrumentation for SQLiteVec vectorstore in Langchain
- The agent now instruments similarity_search for the SQLiteVec vectorstore.
Support statement
We recommend updating to the latest agent version as soon as it's available. If you can't upgrade to the latest version, update your agents to a version no more than 90 days old. Read more about keeping agents up to date.
See the New Relic Python agent EOL policy for information about agent releases and support dates.
Notes
This release of the Python agent drops support for Python 2.7 and adds the following:
- a Kafka server metric
- host and port attributes for memcache
- support for kafka-python-ng
- support for uvicorn_worker
- an environment variable for garbage collector (GC) runtime metrics
- automatic detection of the function signature in LangChain's similarity_search
- updated support for OpenAI's chat class
- a fix for a bug in gRPC's entity name detection
- a fix for transaction context propagation loss in LangChain in the case of a new thread
- parsing of the request and response when running OpenAI via LangChain
- a fix for psycopg v3 API incompatibilities
- a fix that removes versioned logic in loguru instrumentation
Install the agent using easy_install/pip/distribute
via the Python Package Index or download it directly from the New Relic download site.
Deprecations
Remove Python 2.7 Support
New features
Add Kafka server metric
- Adds a metric to capture the host and port information for Kafka entities on both consumers and producers. This metric allows the UI to link AWS information with AWS MSK entities in the service map.
Add capturing of memcache host and port
- Adds capturing of host and port info on the following memcache libraries: bmemcached, aiomcache, and pymemcache. This allows the UI to link AWS information with AWS memcache entities in the service map.
Add support for kafka-python-ng
- kafka-python has been released under a new name: kafka-python-ng. The agent continues to support kafka-python under the new package name.
Add support for uvicorn_worker
- uvicorn.workers has been moved to a separate package called uvicorn_worker. The agent now checks for both module names when reporting dispatcher information.
Automatically detect function signature in LangChain similarity_search
- Automatically detect the function signature when wrapping similarity_search.
Add environment variable for garbage collector runtime metrics
- Garbage collector runtime metrics can now be enabled/disabled via the following environment variable: NEW_RELIC_GC_RUNTIME_METRICS_ENABLED.
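For example (a sketch; the config-file path is a placeholder), the variable only needs to be present in the process environment before the agent initializes:

```python
import os

# Enable garbage collector runtime metrics for this process; normally set in
# the deployment environment rather than in application code.
os.environ["NEW_RELIC_GC_RUNTIME_METRICS_ENABLED"] = "true"

import newrelic.agent

newrelic.agent.initialize("newrelic.ini")  # config-file path is a placeholder
```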
Bug fixes
Fix bug in gRPC entity name detection
- Previously, the gRPC channel entity name was missing the first character(s) in specific cases. This bug has been fixed.
Fix issue in LangChain where a thread is started and the transaction context is lost
- Previously, when LangChain called certain chains, such as retrieval chains, LangChain started a thread that caused the transaction to be lost, resulting in broken instrumentation. This has been fixed so the transaction context is passed across threads and instrumentation still works for retrieval chains.
Fix parsing of request and response when running OpenAI via LangChain
Instrument new path to OpenAI chat completions class
- OpenAI moved the ChatCompletions class to a different path which prevented the instrumentation from being applied. This has been fixed.
Fix psycopg v3 API incompatibilities
- Expand DBAPI2 wrappers to allow arbitrary kwargs on executemany(). Upgrade psycopg v3 instrumentation to allow arbitrary kwargs on executemany(). Rename existing psycopg v3 wrapper arguments to match the upstream library's names for compatibility.
Remove versioned logic in loguru instrumentation
- Removed versioned logic in loguru instrumentation to fix a bug that occurred when the version was undeterminable.
Support statement
We recommend updating to the latest agent version as soon as it's available. If you can't upgrade to the latest version, update your agents to a version no more than 90 days old. Read more about keeping agents up to date.
See the New Relic Python agent EOL policy for information about agent releases and support dates.
Notes
This release of the Python agent adds the following:
- Support for boto3's upload_file command
- Support for newly added Redis commands
- AWS SQS information on spans
- server.address attribute on RabbitMQ
This release also fixes an issue with OpenAI's client instrumentation and missing parameters in Botocore SQS.
Install the agent using easy_install/pip/distribute
via the Python Package Index or download it directly from the New Relic download site.
New features
Add instrumentation for upload_file in boto3
- Add instrumentation to the s3transfer BoundExecutor submit function, allowing the agent to support the upload_file operation in boto3.
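For reference, this is the boto3 call path the new instrumentation covers; the bucket, key, and local file below are placeholders.

```python
import boto3

# upload_file goes through s3transfer's executor submit path, which the agent
# now instruments. Bucket, key, and local filename are placeholders.
s3 = boto3.client("s3")
s3.upload_file("report.csv", "example-bucket", "reports/report.csv")
```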
Add support for new Redis commands
- Add instrumentation for the following commands in Redis:
- hexpire
- hexpireat
- hexpiretime
- hpersist
- hpexpire
- hpexpireat
- hpexpiretime
- hpttl
- httl
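As an illustration of the newly instrumented hash-field TTL commands, a sketch using redis-py is below; the exact client call signatures are an assumption, so verify them against your redis-py version.

```python
import redis

r = redis.Redis()  # connection details are placeholders

r.hset("session:42", mapping={"token": "abc", "theme": "dark"})
# Hash-field TTL commands (argument order is an assumption):
r.hexpire("session:42", 60, "token")   # expire one field after 60 seconds
print(r.httl("session:42", "token"))   # remaining TTL for that field
```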
Capture AWS SQS information on message spans:
- The Python Agent now captures the following AWS SQS information on message spans, allowing the UI to link AWS information with AWS SQS:
- messaging.system - aws_sqs
- cloud.region - AWS region
- cloud.account.id - AWS account ID
- messaging.destination.name - AWS queue name
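For context, these attributes are collected from ordinary SQS calls made inside a monitored transaction, such as the placeholder example below.

```python
import boto3

# A typical botocore SQS call; the resulting message span carries
# messaging.system, cloud.region, cloud.account.id, and
# messaging.destination.name. Region and queue URL are placeholders.
sqs = boto3.client("sqs", region_name="us-east-1")
sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/example-queue",
    MessageBody="hello",
)
```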
Add server.address attribute to RabbitMQ:
- Add server.address attribute to RabbitMQ, allowing the UI to link AWS information with AWS MQ.
Bug fixes
Fix TypeError in OpenAI instrumentation:
- Newer versions of OpenAI include an additional argument, resulting in an error with the base client instrumentation. This has been fixed.
Fix missing parameters in MessageTrace:
- Resolved issue where parameters were not being passed from the MessageTraceWrapper to the MessageTrace.
Support statement
We recommend updating to the latest agent version as soon as it's available. If you can't upgrade to the latest version, update your agents to a version no more than 90 days old. Read more about keeping agents up to date.
See the New Relic Python agent EOL policy for information about agent releases and support dates.
Notes
This release of the Python agent adds instrumentation for aiomcache, support for an internally set collect_ai server-side configuration setting, and support for formatting stack traces in NewRelicContextFormatter. It also fixes a crash in package capturing in Python 2.7, an attribute check in OpenAI instrumentation, and casing on LangChain metric and span names.
Install the agent using easy_install/pip/distribute
via the Python Package Index or download it directly from the New Relic download site.
New features
Add instrumentation for aiomcache:
- Add instrumentation support for aiomcache. Thank you Jair Henrique for the contribution!
Add support for collect_ai server-side config:
- Add support for an internally set collect_ai server-side configuration setting. Note that this is returned in the connect response when a particular feature flag is set on an account.
Add support for formatting stack traces in NewRelicContextFormatter:
- NewRelicContextFormatter now implements the format_exc_info class method that formats stack traces. Thank you Daniel Fritz for the contribution!
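A minimal sketch of wiring the formatter into standard-library logging so that logged exceptions include a formatted stack trace:

```python
import logging

from newrelic.agent import NewRelicContextFormatter

handler = logging.StreamHandler()
# When a record carries exc_info, the formatter now formats the stack trace
# via the new format_exc_info class method.
handler.setFormatter(NewRelicContextFormatter())

logger = logging.getLogger(__name__)
logger.addHandler(handler)

try:
    1 / 0
except ZeroDivisionError:
    logger.exception("division failed")
```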
Bug fixes
Fix crash in package capture in Python 2.7:
- Previously, when capturing packages on Python 2.7, an UnboundLocalError would be raised. This has been fixed.
Fix attribute check in OpenAI:
- Fix a typo in an attribute check inside the OpenAI instrumentation. Thank you Liam Niehus-Staab for the contribution!
Fix casing on LangChain metric and span names:
- Previously, if LangChain was being used in combination with an unsupported Large Language Model library (that is, not Bedrock or OpenAI), the AI dashboard tab would not appear in the UI because the metric name was not cased correctly. This has been fixed, and the span names have been updated to match the casing of the metric name.
Support statement
We recommend updating to the latest agent version as soon as it's available. If you can't upgrade to the latest version, update your agents to a version no more than 90 days old. Read more about keeping agents up to date.
See the New Relic Python agent EOL policy for information about agent releases and support dates.
Notes
This release of the Python agent adds agent_language to lambda metadata, support for injecting the agent into Kubernetes, and support for psycopg 3.0+. It also optimizes plugins list capturing and fixes the Large Language Model event duration units, a crash in ASGI when the Content-Length header is missing, and crashes when using OpenAI's .with_raw_response. and .with_streaming_response. accessors.
Install the agent using easy_install/pip/distribute
via the Python Package Index or download it directly from the New Relic download site.
New features
Add agent_language to lambda metadata
- Add agent_language to collected lambda metadata.
Optimize plugins list capturing
- Skip checking for a package version on newrelic hooks that we know do not have versions.
Add support for injecting the agent into Kubernetes
- Updates the bootstrap sitecustomize file to support injecting the agent into a Kubernetes cluster.
- A full product for agent injection in Kubernetes will be coming soon in public preview.
- A new informational only setting called k8s_operator.enabled (with NEW_RELIC_K8S_OPERATOR_ENABLED as an environment variable) was added, which is used to report when the agent is injected into a Kubernetes cluster. This setting does not enable/disable this function of the agent.
Add support for psycopg 3.0+
- New instrumentation for psycopg 3.0+ has been added, providing database tracing for both the Connection and AsyncConnection classes.
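For reference, both of the now-traced classes follow psycopg's standard usage; the connection string and query below are placeholders.

```python
import asyncio

import psycopg

# Synchronous Connection (traced).
with psycopg.connect("dbname=example") as conn:
    conn.execute("SELECT 1")

# AsyncConnection (also traced).
async def main():
    async with await psycopg.AsyncConnection.connect("dbname=example") as aconn:
        await aconn.execute("SELECT 1")

asyncio.run(main())
```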
Bug fixes
Fix Large Language Model event duration units
- Previously, durations on LLM events were recorded in seconds which did not match some of the other language agents. This has been changed to be milliseconds.
Fix crash when using OpenAI's .with_raw_response.
- Previously, an exception would be raised inside the instrumentation when .with_raw_response. was used. This no longer happens; the instrumentation successfully records LLM data when .with_raw_response. is used.
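For reference, this is the kind of call that previously raised; the model name and prompt below are placeholders.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Raw-response access; LLM data is still recorded by the agent.
raw = client.chat.completions.with_raw_response.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
completion = raw.parse()  # recover the parsed ChatCompletion object
print(completion.choices[0].message.content)
```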
Fix crash when using OpenAI's .with_streaming_response.
- Previously, an exception would be raised inside the instrumentation when .with_streaming_response. was used. This no longer happens; the instrumentation is just skipped.
Fix a crash in ASGI when the Content-Length header is missing
- Previously, an exception would be raised inside the instrumentation that injects the browser agent when an ASGI response was missing the Content-Length header. This issue has been fixed.
Support statement
We recommend updating to the latest agent version as soon as it's available. If you can't upgrade to the latest version, update your agents to a version no more than 90 days old. Read more about keeping agents up to date.
See the New Relic Python agent EOL policy for information about agent releases and support dates.
Notes
This release of the Python agent:
- Adds instrumentation for AIOBotocore
- Adds support for Meta Llama3 and Mistral AI in Amazon Bedrock
- Fixes an error parsing issue in OpenAI
- Updates Loguru's instrumentation to use milliseconds instead of seconds
Install the agent using easy_install/pip/distribute
via the Python Package Index or download it directly from the New Relic download site.
New features
Add AIOBotocore instrumentation
- Add instrumentation for AIOBotocore. Supports proxy mode in addition to standard operation.
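For reference, calls made through aiobotocore's usual client pattern are now traced; the region, bucket, and key below are placeholders.

```python
import asyncio

from aiobotocore.session import get_session

async def main():
    session = get_session()
    # Client calls made inside a monitored transaction are now traced.
    async with session.create_client("s3", region_name="us-east-1") as client:
        await client.put_object(Bucket="example-bucket", Key="hello.txt", Body=b"hi")

asyncio.run(main())
```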
Add new Amazon Bedrock models
- Add instrumentation for the following Amazon Bedrock models:
- Meta Llama3
- Mistral AI
Bug fixes
Fix OpenAI error parsing
- Previously, if an error in OpenAI was encountered, the Python Agent would attempt to override the default error.message within notice_error() with the one used in OpenAI. However, if there was no message attribute in the error, the Python Agent would crash. Now it defaults to the original methodology of error handling by the Python Agent.
Record timing in Loguru in milliseconds
- Timing in Loguru is recorded as an integer, so any value of less than 1 second was recorded as 0s. This changes the units to milliseconds instead. Thanks, [@julia-tadej-wwtech](https://github.com/julia-tadej-wttech), for the contribution!
Support statement
We recommend updating to the latest agent version as soon as it's available. If you can't upgrade to the latest version, update your agents to a version no more than 90 days old. Read more about keeping agents up to date.
See the New Relic Python agent EOL policy for information about agent releases and support dates.
Notes
This release of the Python agent fixes Celery instrumentation on worker processes, adds new Langchain vectorstores, adds HTTP method attributes to urllib3 traces, and fixes an issue with URI detection in gRPC clients.
Install the agent using easy_install/pip/distribute
via the Python Package Index or download it directly from the New Relic download site.
New features
Add new Langchain vectorstores
- Support for the following Langchain vectorstores: Relyt, OracleVS, UpstashVectorStore, and VLite
Bug fixes
Fix Celery instrumentation on worker processes
- Instrumentation updates in v9.9.0 introduced a bug where Celery workers running with worker optimizations enabled would overwrite instrumentation. This has been fixed and instrumentation should now function the same with and without worker optimizations enabled.
Add HTTP method attributes to urllib3 traces
- urllib3 traces did not include the HTTP method as an attribute previously. This has now been added.
Fix gRPC URI detection for client ExternalTraces
- The latest version of gRPC changed the format of URIs used in clients, which caused the hostname to be reported as "dns" for all client requests. This has been fixed and hostnames should be reported correctly again.
Support statement
We recommend updating to the latest agent version as soon as it's available. If you can't upgrade to the latest version, update your agents to a version no more than 90 days old. Read more about keeping agents up to date.
See the New Relic Python agent EOL policy for information about agent releases and support dates.
Notes
This release of the Python agent updates Celery instrumentation, adds new Langchain vectorstores, adds configuration to capture memory usage runtime metrics per process, fixes a content reporting issue in Anthropic Claude, and upgrades internal version of urllib3 to v1.26.18.
Install the agent using easy_install/pip/distribute
via the Python Package Index or download it directly from the New Relic download site.
Security
Upgrade internal version of urllib3 to v1.26.18
- Upgrade the internal version of urllib3 used in the Python Agent to v1.26.18 to resolve security warnings.
New features
Update Celery instrumentation
- Add support for distributed tracing in Celery over AMQP headers.
- Remove duplicate function traces for some methods of running tasks.
- All tasks should now only be instrumented once, and will report either an OtherTransaction or a FunctionTrace if run under an existing transaction.
- Fix instrumentation for grouped Celery task APIs, such as task.map(), celery.group(), and celery.chunks() (see the sketch after this list).
- Tasks run using task.map() or task.starmap() will now name tasks in the format Celery/celery.map/my_task to allow you to differentiate between map tasks.
- Individual task runs created by the map() and starmap() tasks will be traced with a FunctionTrace to capture individual timings.
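The sketch below exercises those grouped-task APIs eagerly in-process (broker URL and task body are placeholders); under this release, the map/starmap runs are named in the Celery/celery.map/add format and the individual runs appear as FunctionTraces.

```python
from celery import Celery, group

app = Celery("tasks", broker="memory://")  # broker URL is a placeholder
app.conf.task_always_eager = True          # run in-process for this sketch

@app.task
def add(x, y):
    return x + y

# Grouped/mapped invocations covered by this change.
group(add.s(1, 2), add.s(3, 4)).apply_async()   # celery.group()
add.starmap([(1, 2), (3, 4)]).apply_async()     # task.starmap() -> Celery/celery.map/add
add.chunks([(1, 2), (3, 4)], 2).apply_async()   # task.chunks()
```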
Add new Langchain vectorstores
- Support for the following Langchain vectorstores: DuckDB, EcloudESVectorStore, InMemoryVectorStore, PathwayVectorClient, and VDMS
Add configuration setting to enable capturing of memory usage runtime metrics per process
- Add memory_runtime_pid_metrics.enabled configuration setting to toggle capturing of memory usage metrics (per process ID only) to reduce Metric Grouping Issues (MGI).
Bug fixes
Report only raw content dictionary for Anthropic Claude
- Previously, AWS Bedrock's Anthropic Claude model would report a list of dictionaries of message content. Now it reports a single dictionary of message content.
Support statement
We recommend updating to the latest agent version as soon as it's available. If you can't upgrade to the latest version, update your agents to a version no more than 90 days old. Read more about keeping agents up to date.
See the New Relic Python agent EOL policy for information about agent releases and support dates.
Notes
This release of the Python agent adds support for the latest versions of asgiref and support for AI monitoring when using the following libraries: OpenAI, AWS Bedrock, and Langchain.
Install the agent using easy_install/pip/distribute
via the Python Package Index or download it directly from the New Relic download site.
New features
Add support for asgiref 3.8.0 and above
- Asgiref released a new version that resulted in missing transaction context. This has been fixed.
AI monitoring
- New Relic AI monitoring is the industry's first APM solution that provides end-to-end visibility into the key components of an AI Large Language Model (LLM) application. With AI monitoring, users can monitor, alert, and debug AI-powered applications for reliability, latency, performance, security, and cost. AI monitoring also enables AI/LLM-specific insights (metrics, events, logs, and traces) that can easily integrate to build advanced guardrails for enterprise security, privacy, and compliance.
- AI monitoring offers custom-built insights and tracing for the complete lifecycle of an LLM’s prompts and responses, from raw user input to repaired/polished responses. AI monitoring provides built-in integrations with popular LLMs and components of the AI development stack. This release provides instrumentation for OpenAI, AWS Bedrock, and Langchain.
- When AI monitoring is enabled with ai_monitoring.enabled = true, the agent will now capture AI LLM related data. This data will be visible under a new APM tab called AI Responses. See our AI Monitoring documentation for more details.
AI monitoring configuration
- New configuration options are available specific to AI monitoring. These settings include:
- ai_monitoring.enabled
- ai_monitoring.streaming.enabled
- ai_monitoring.content.enabled
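As a sketch of checking what is in effect at runtime (assuming the nested attribute paths mirror the dotted setting names listed above):

```python
import newrelic.agent

# Requires the agent to be initialized/configured; attribute paths are assumed
# to mirror the dotted configuration names listed above.
settings = newrelic.agent.global_settings()
print(settings.ai_monitoring.enabled)
print(settings.ai_monitoring.streaming.enabled)
print(settings.ai_monitoring.content.enabled)
```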
AI monitoring public API methods
- Two new AI monitoring related public API methods have been added:
Add support for AWS Bedrock
- Support for AWS Bedrock Large Language Model instrumentation has been added. Chat completion and embedding data for streaming and non-streaming is recorded for the following models:
- amazon.titan-*
- ai21.j2-*
- anthropic.claude-*
- cohere.command-*
- meta.llama2-*
- amazon.titan-embed*
- cohere.embed*
Add support for Langchain
- Support for Langchain Large Language Model instrumentation has been added. Langchain Agents, Chains, Tools, OpenAI, and Bedrock LLM data is recorded. Note streaming is not supported at this time.
Add support for OpenAI
- Support for OpenAI Large Language Model instrumentation has been added. Synchronous and asynchronous chat completion and embedding data is recorded. Note streaming is only supported in chat completions at this time.
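As a placeholder illustration of the traffic that is now instrumented when AI monitoring is enabled (model names and inputs are arbitrary):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Chat completion: recorded (streaming is also supported for chat completions).
chat = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize New Relic AI monitoring."}],
)

# Embedding: recorded (non-streaming).
emb = client.embeddings.create(model="text-embedding-3-small", input="hello world")
```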
Support statement
We recommend updating to the latest agent version as soon as it's available. If you can't upgrade to the latest version, update your agents to a version no more than 90 days old. Read more about keeping agents up to date.
See the New Relic Python agent EOL policy for information about agent releases and support dates.