Application Analytics
AppDynamics Pro Documentation, Version 4.1.4
Copyright AppDynamics 2012-2015
Contents

Application Analytics
Deployment Options and Scenarios
Configuring Application Analytics Data Sources
  Configuring Log Analytics
  Configuring Transaction Analytics
Using the Application Analytics UI
  Constructing Searches for Application Analytics
  Creating Metrics for Alerts
  Troubleshooting
Application Analytics Licenses
Event Data for Log Analytics
Event Data for Browser and Mobile Request Analytics
Event Data for Transaction Analytics
Installing Agent-Side Components
Application Analytics

On this page:
  Transaction Analytics Extends AppDynamics APM

Watch the video: Analytics Demo

AppDynamics Application Analytics enables real-time analysis of business performance correlated with the performance of your application software. As a separately licensed product in the AppDynamics Platform, Application Analytics enhances and extends the AppDynamics APM, Browser RUM, and Mobile RUM product modules to provide Transaction Analytics, Log Analytics, and Browser and Mobile Analytics (Beta).

Application Analytics can help you answer business-oriented questions such as:

- How many users experienced failed checkout transactions in the last 24 hours?
- How much revenue was lost because of these failures?
- How is that revenue distributed across different product categories?
- What is your revenue for the day for a geographical region?
- What was the revenue impact by product category associated with the two marketing campaigns we ran last week?

Transaction Analytics provides analytics on data collected by the AppDynamics Java and .NET Agents. With Transaction Analytics, you can collect and analyze three kinds of data:

- Default performance data collected by the app agents about your application's business transactions
- HTTP-based data collected by HTTP data collectors
- Custom data collected by method invocation data collectors

For details on the data that is collected, see Event Data for Transaction Analytics. To enable Transaction Analytics for your applications, see Installing Agent-Side Components.

Log Analytics provides analytics on data collected from any type of log file, covering instrumented and non-instrumented applications as well as infrastructure, and can be used as a standalone component. You can search and analyze log data just as you do transaction data. Log Analytics works out of the box with the syslog (log4j) format and can be easily configured for other log formats.
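To make the idea of structuring log records concrete, the following is a minimal sketch in Python of the kind of field extraction involved. It uses a plain regular expression with named groups standing in for the grok-based job files described later; the log line, field names, and pattern are illustrative only, not the product's actual configuration.

```python
import re

# A log4j-style line as it might appear in an application log (hypothetical).
line = "2015-06-01 12:00:01,123 ERROR [http-8080-1] com.example.CartService - Checkout failed"

# Named groups play the role that grok identifiers play in a real job file:
# each group name becomes a structured analytics field.
pattern = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "
    r"(?P<logLevel>\w+) "
    r"\[(?P<threadName>[^\]]+)\] "
    r"(?P<class>\S+) - "
    r"(?P<logMessage>.*)"
)

fields = pattern.match(line).groupdict()
print(fields["logLevel"])    # ERROR
print(fields["class"])       # com.example.CartService
```

In the actual product, the same mapping is expressed declaratively in a job file rather than in code.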
You can capture other log formats by setting up a regular-expression-based mapping. For details on the data that is collected, see Event Data for Log Analytics.

Browser and Mobile Analytics (beta in 4.1) provides analytics on data collected by the AppDynamics End User Monitoring (EUM) JavaScript Agent and Mobile SDKs. This component is
an add-on to EUM and presents data collected from Browser Real User Monitoring and Mobile Real User Monitoring in a more flexible search format than in the main EUM screens. If you have enabled Browser RUM or Mobile RUM and enabled analytics, you see these components on the Analytics page. For details on how analytics extends the capability of EUM, see the section "Browser Analyze versus Browser Request Analytics" in Browser Analyze. For information on using the on-premise EUM Server, see Use the On-Premise EUM Server.

Transaction Analytics Extends AppDynamics APM

Analytics collects data for every instance of a business transaction. Each instance of a business transaction passing through a tier is an event. You can view the raw data for any transaction instance that occurred within the data retention time. This gives you the ability to analyze the business impact of every event and learn how your customers use your application.

Collecting Transaction Analytics data requires no change to your application code. You enable analytics on the app agents and Controller you already use. Once you have enabled analytics on your application, you can enable pre-existing data collectors to collect additional business data for every transaction instance, and you can add more data collectors.

Deployment Options and Scenarios

On this page:
  SaaS or On-Premise?
  Agent Side Components
  Deploying Analytics Agents to Multiple Nodes
  Server Side Components for On-Premise
  Deployment Options Summary
  Download the Software

Application Analytics is built on top of the AppDynamics platform, including the Events Service, the unstructured document store for the platform. The exact components you need to install, and the steps for doing so, depend on your environment and your requirements. This section outlines some of those considerations and scenarios.

SaaS or On-Premise?
The first deployment question to consider is whether to use SaaS or on-premise AppDynamics Analytics. With SaaS, AppDynamics stores the data and hosts the server components of the system for you. With on-premise, you host the components yourself, storing all data on premises. The key differences are:

- For SaaS, you need to install only the application agent components of the system, as described in Agent Side Components.
- For on-premise, you need to install the agent-side components as well as the Controller and Events Service components described in Install the Controller and Install the Events Service. The on-premise deployment involves additional setup and administration, but it enables you to retain all analytics data within your own data center.

Agent Side Components

Whether you are using SaaS or on-premise Application Analytics, you need to deploy (or enable) components that provide Analytics functionality to the parts of the system that reside in your application environment. These appear on the left side of the diagram. As numbered, the agent components are:

1. AppDynamics App Agent: Application Analytics relies on the same app agents that AppDynamics APM uses. If you already use AppDynamics APM, you likely already have these deployed to your monitored applications.
2. Analytics Agent: The Analytics Agent collects data from one or more app agents and sends it to the Events Service. It also reads and transmits log data from log files on the local machine. For Java installations, the Analytics Agent is provided as an extension to the Standalone Machine Agent and is embedded in the Standalone Machine Agent binary distribution. For .NET, you need to download and install the analytics_agent.zip file, a separate Java application that collects transaction and log data from .NET applications running on Windows. For SaaS-based installations, the connection between the Analytics Agent and the Events Service in the cloud takes place over ports 80 (HTTP) and 443 (HTTPS). For on-premise installations, the port on which the Events Service receives data from the Analytics Agent is configured during installation.
3. Analytics Plugin (AP) to the App Agent: The Analytics Plugin extends App Agent functionality so that it can collect and forward data to the Analytics Agent. It is built into the 4.0 version of the Java and .NET App Agents, but is not enabled by default. No additional download is needed.
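As a rough illustration of the agent-to-Events-Service connection described above, the relevant settings live in the Analytics Agent's analytics-agent.properties file (discussed further under Verify Analytics Agent Properties later in this document). The values below are placeholders, not real endpoints or credentials:

```properties
# Events Service endpoint: HTTPS over port 443 for SaaS;
# for on-premise, the port is set during installation.
http.event.endpoint=https://analytics.example.com:443
# Account name and access key, taken from the Controller.
http.event.accountname=customer1
http.event.accesskey=REPLACE_WITH_ACCESS_KEY
```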
Deploying Analytics Agents to Multiple Nodes

A real-world deployment is unlikely to consist of a single monitored node, as shown in the diagram above. It usually consists of many applications deployed over many hosts. While the APM app agents continue to send data to the Controller in the normal way, the Analytics Plugin sends its data to the Analytics Agent component. This component runs in a separate JVM process in the local environment or network, either as part of the Standalone Machine Agent or on its own. There must be at least one Analytics Agent in the monitored environment, although multiple agent plugins collecting only transaction data can share a single Analytics Agent, as shown in the figure. However, each machine where you want to gather log data must have its own Standalone Machine Agent with the Analytics Agent enabled. In this example, AppServerHost1-3 are collecting only transaction data, so only app agents are required. Each one connects to the Analytics Agent on AppServerHost4, where log information is
also being collected. The Analytics Agents could also reside on the same machines as the app agents. On AppServerHost5, only log data is being collected, so an Analytics Agent is also necessary there.

Server Side Components for On-Premise

To set up AppDynamics Analytics on-premise, you also need to install the server parts of the system: the AppDynamics Controller and the Events Service. The following figure depicts the basic components of an on-premise deployment. The server components add the following to the deployment:

- AppDynamics Controller: The heart of an AppDynamics deployment, the Controller processes and presents the information gathered by the agents.
- Events Service: The unstructured document store. It gathers and stores data from the Analytics Agents and, if you have End User Monitoring, from the EUM Server, and it allows the Controller UI to run queries on that data. If you are using SaaS EUM, you must use the SaaS Events Service. If you are using the on-premise EUM Server, you must use an on-premise instance of the Events Service.

Deployment Options Summary

To summarize the scenarios, when planning your deployment, consider the following:
1. Are you implementing Application Analytics as a SaaS or an on-premise solution? For SaaS, the Events Service is provided as part of the SaaS service. For on-premise, the Controller and the Events Service must be installed on premises as well.
2. For each monitored machine, are you capturing log data, transaction data, or both?
   - For log data only, install either the Standalone Machine Agent (Java) or the Analytics Agent (.NET) on the host and enable the App Agent Plugin.
   - For log data and transaction data, install either the Standalone Machine Agent (Java) or the Analytics Agent (.NET) on the host and enable the App Agent Plugin.
   - For transaction data only, install either the Standalone Machine Agent (Java) or the Analytics Agent (.NET) on each host, or use a common, shared Analytics Agent instance. Enable the App Agent Plugin on each host.

The following topics describe how to install the components:

- Installing Agent Side Components: You need to perform these steps whether you are using the on-premise or the SaaS form of the Application Analytics product.
- Install the Controller and Install the Events Service: You need to perform these steps only if you want your installation to be fully on-premise.

Download the Software

For SaaS deployments, the agent-side distribution archives for AppDynamics Application Analytics are available on the AppDynamics SaaS Download Zone. For on-premise deployments, contact your AppDynamics Account Manager.

Configuring Application Analytics Data Sources

Once you have installed the components and enabled Application Analytics, you need to configure the system to collect the data you are interested in. If you are not already there, go to the Configuration page:

1. In the Controller UI, use the context menu on the top left to select Analytics.
2. Select Configuration.

You configure Transaction Analytics by enabling analytics for specific applications and business transactions.
You can also enable analytics for existing data collectors or create new ones specifically for your analytics needs.

You configure Log Analytics by creating a job file. The job file specifies the location of the source log file, the pattern for structuring the records in the log, and other options for capturing records from the log source.

Browser Request Analytics and Mobile Request Analytics, an alternate view of EUM data, are in beta for 4.1. Data sources in the beta require no configuration. If you are using an on-premise Events Service and an on-premise EUM Server, however, you do need to configure the Server to send its data to the Events Service. See Install and Configure the On-Premise EUM Server.

Beta Disclaimer

This documentation mentions features that are currently beta. AppDynamics reserves the
right to change the features at any time before making them generally available, or to never make them generally available. Any buying decisions should be made based on features that are currently generally available.

Configuring Log Analytics

On this page:
  Set Up Log Analytics
  Support for Numeric Fields (new in 4.1.3)
  Troubleshoot Logs
  Troubleshoot Patterns

Related pages:
  Event Data for Log Analytics

To capture and present log records as analytics data, you must configure one or more log sources for the Analytics Agent. Once set up, the log source is used by the Analytics Agent to import records from the log file, structure the records according to your configuration, and send the data to the Analytics Processor. From there, the Controller presents the data in the Application Analytics UI.

Make sure you have installed and configured the components described in Installing Agent-Side Components and, for on-premise, Install the Controller and Install the Events Service, before attempting to configure Log Analytics.

Set Up Log Analytics

The general steps to configure log analytics are:

1. Describe the Log Source in a Job File
2. Reuse or Create New Grok Expressions
3. Verify Analytics Agent Properties

Describe the Log Source in a Job File

Each log source is represented by a job file. A job file is a configuration file that specifies the location of the source log file, the pattern for structuring the records in the log, and other options for capturing records from the log source. To define a source, you create a job file (or modify one of the samples) in the Analytics Agent configuration directory. The Analytics Agent includes sample job files for Glassfish, OS X logs, and others. The job files are located in the following directory:

<Analytics_Agent_Home>/conf/job/

The agent reads the job files in the directory dynamically, so you can add job files to the directory without restarting the agent.
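Before walking through the individual settings, here is a rough sketch of the overall shape of a job file, assembled from the settings described in this section. The key names and values are illustrative assumptions; consult the sample job files in <Analytics_Agent_Home>/conf/job/ for the exact syntax supported by your version.

```yaml
enabled: true                     # activate this log source
file: "/var/log/myapp/app.log"    # local path; wildcards must be quoted
multiline:
  startsWith: " "                 # continuation lines begin with whitespace
fields:                           # free-form key-value context for the Controller UI
  application: myapp
  tier: web
grok:
  patterns:
    - "\\[%{LOGLEVEL:logLevel}%{SPACE}\\] \\[%{DATA:threadName}\\] \\[%{JAVACLASS:class}\\] %{GREEDYDATA:logMessage}"
eventTimestamp:
  pattern: "yyyy-MM-dd'T'HH:mm:ss,SSSZ"
```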
To configure a job file, use the following configurable settings in the file:
- Enabled: Determines whether this log source is active. Set this to true to have the Analytics Agent attempt to capture records from this log source.
- File: The location and name of the log file to serve as a log source. This must be on the local machine. If you are using a wildcard for the filename, you must surround it with quotes. For example, nameGlob: "*.log".
- Multiline: For log file formats that may include records spanning multiple lines, configure the multiline setting to indicate how the individual records in the log file can be identified. A typical example of a multiline log record is one that includes a Java exception. If the log source includes multiline records, use either of the following settings to identify the continuation lines in a multiline log record:
  - startsWith: A simple prefix that matches the start of the continuation lines of a multiline log record.
  - regex: A regular expression that matches any identifying feature in the continuation lines of a multiline log record.

  Note: If the particular format of a multiline log file does not permit reliable continuation-line matching by regular expression, you may choose to use a single-line format. For most types of logs, this results in the capture of the majority of log records.
- Fields: The fields specify the context of the log data in the Controller UI, by application name, tier name, and so on. Specify the fields as free-form key-value pairs.
- grok: The grok setting specifies the patterns by which the data in the unstructured log record is mapped to structured analytics fields. It associates a named grok expression (as defined in a .grok file in the <Analytics_Agent_Home>/conf/grok directory) with a field in the data as structured by the agent. For example:

  grok:
    patterns:
      - "\\[%{LOGLEVEL:logLevel}%{SPACE}\\] \\[%{DATA:threadName}\\] \\[%{JAVACLASS:class}\\] %{GREEDYDATA:logMessage}"
      - "pattern 2"
      ...
  In this case, the grok-pattern name LOGLEVEL is matched to an analytics data field named logLevel. The regular expression specified by the name LOGLEVEL is defined in the file grok-patterns.grok in the grok directory. For more about grok expressions, see Reuse or Create Grok Expressions. Previous versions of Log Analytics used a single "pattern" rather than a pattern list. This mode is still supported for backward compatibility.
- eventTimestamp: This setting defines the pattern for the timestamp associated with captured data.

Create Extracted Fields on the Fly, Using the Log Analytics Controller UI

Creating complex patterns can be tricky. Sometimes it is easier to select a sample job file that may successfully map some of the information you have in your source log files, and then use an
interactive display to explore additional patterns in real time. In these situations, you can set the Enabled flag and the File log location in a sample job file to begin collecting your logs, and then use the UI to dynamically fine-tune your fields. This process is called creating Extracted Fields.

1. In the Log Analytics Controller UI, click Create New Fields.
2. The Select Source Type popup appears. Use the dropdown menu to select the log file source (based on the job file you used) with which you want to work.
3. A timestamped list of log entries appears.
   Use the counters at the bottom of the page to move through the list, as needed.
4. Click Next. The Create New Fields popup appears.
5. Click Add Field to try a pattern. Enter the pattern, using Java-based regular expressions, in
   the Pattern field. The pattern includes both the name you want for the field (in the screenshot, Field1) and the regex for the value. Grok-based patterns are not supported.
6. Click Apply.
7. The result of the pattern is highlighted in the New Fields Preview panel below, and a column with that name is populated with the discovered value.
8. To create multiple fields, repeat the process, beginning with Add Field.
9. When you are satisfied with the results, click Save.

Each Analytics Agent in your installation periodically syncs its definitions with those created in the Controller.

Reuse or Create Grok Expressions

Grok is a way to define and use complex, nested regular expressions in an easy-to-read format. Regexes defining discrete elements in a log file are mapped to grok-pattern names, which can also be used to create more complex patterns. Grok-pattern names for many of the common types of data found in logs are already created for you. A list of basic grok-pattern names and their underlying structures can be seen here:

<Analytics_Agent_Home>/conf/grok/grok-patterns.grok

The grok directory also contains samples of more complex definitions customized for various common log types: java.grok, mongodb.grok, and so on. Additional grok patterns can be seen at https://grokdebug.herokuapp.com/patterns#

Once the grok-pattern names are created, they are associated in the job file with field identifiers that become the analytics keys. The basic building block is %{grok-pattern name:identifier}, where grok-pattern name is the grok pattern that knows about the type of data in the log you want to fetch (based on a regex definition), and identifier is your identifier for the kind of data, which becomes the analytics key. So %{IP:client} would select an IP address in the log record and map it to the key client.

Custom Grok Patterns

Complex grok patterns can be created using nested basic patterns.
For example, from the mongodb.grok file:

MONGO_LOG %{SYSLOGTIMESTAMP:timestamp} \[%{WORD:component}\] %{GREEDYDATA:message}

It is also possible to create entirely new patterns using regular expressions. For example, the following line from java.grok defines a grok pattern named JAVACLASS.

JAVACLASS (?:[a-zA-Z$_][a-zA-Z$_0-9]*\.)*[a-zA-Z$_][a-zA-Z$_0-9]*

Because JAVACLASS is defined in a .grok file in the grok directory, it can be used as if it were a basic grok pattern. In a job file, you can use the JAVACLASS pattern match as follows:
grok:
  pattern: "... \\[%{JAVACLASS:class}\\]"

In this case, the field name as it appears in the Application Analytics UI would be "class". For a full example, see the following files:

Job file: <Analytics_Agent_Home>/conf/job/sample-analytics-log.job
Grok file: <Analytics_Agent_Home>/conf/grok/java.grok

Support for Numeric Fields (new in 4.1.3)

In Release 4.1.3, the grok definition syntax has been enhanced to support three basic data types. When defining a pattern in the .grok file, you can specify the data type as number, boolean, or string. If a grok alias uses that grok definition in a .job file, the extracted field is stored as a number or boolean. Strings are the default. If the number or boolean conversion fails, a log message appears in the agent's log file. No validations are performed, as it is not possible to reliably reverse-engineer a regex. These are pure runtime extractions and conversions.

Upgrade pre-4.1.3 Job Files

If 4.1.2 (or older) .job files in use have fields that are unspecified or specified as NUMBER, and you switch to the "type aware" files, the data inside the Events Service will break. This is due to the type mapping. To avoid this, you need to modify the grok alias in your job files. Examples:

Was:
grok:
  patterns:
    - "%{DATE:happenedAt},%{NUMBER:quantity}"

Update job to:
grok:
  patterns:
    - "%{DATE:happenedAt},%{NUMBER:quantity_new}"

Was:
grok:
  patterns:
    - "%{DATE:happenedAt},%{DATA:howMany}"

Update job to:
grok:
  patterns:
    - "%{DATE:happenedAt},%{POSINT:howManyInt}"

To upgrade (migrate) pre-4.1.3 job files:

1. Stop the analytics-agent.
2. Change .job files that use the enhanced grok patterns:

   BOOL:boolean
   INT:number
   BASE10NUM:number
   NUMBER:number
   POSINT:number
   NONNEGINT:number

   Change the grok alias so as not to conflict with the older aliases:

   grok:
     patterns:
   (Old)
       - "%{DATE:quoteDate},%{NUMBER:open},%{NUMBER:high},%{NUMBER:low},%{NUMBER:close},%{NUMBER:volume},%{NUMBER:adjClose}"
   (New aliases)
       - "%{DATE:quoteDate},%{NUMBER:openNum},%{NUMBER:highNum},%{NUMBER:lowNum},%{NUMBER:closeNum},%{NUMBER:volumeNum},%{NUMBER:adjCloseNum}"

3. Start the analytics-agent.

Verify Analytics Agent Properties

In addition to configuring the log source in the job file as described above, you should verify the settings in the analytics-agent.properties file in the conf directory. In the file:

- http.event.endpoint should be the location of the Events Service.
- The http.event.accountname and http.event.accesskey settings should be set to the name and key of the Controller account with which the logs should be associated. By default, they are set to the built-in account for a single-tenancy Controller.
- The pipeline.poll.dir setting specifies where the log configuration files are located. This would not normally be changed unless you want to keep your files in a different location.

Troubleshoot Logs

If log capture is working correctly, logs should start appearing in the Log tab in the Analytics UI. It can take some time for logs to start accumulating. Note the following troubleshooting points:

- If nothing appears in the log view, try searching over the past 24 hours. Timezone discrepancies between the logs and the local machine can cause log entries to be incorrectly excluded based on the selected timeframe in the Controller UI. To remediate, try setting the log files and system time to UTC, or log the timezone with the log message to verify.
- An inherent delay in indexing may result in the "last minute" view in the UI consistently yielding no logs. Increase the time range if you encounter this issue.

Troubleshoot Patterns

To help you troubleshoot the data extraction patterns in your job file, you can use the two debug REST endpoints in the Analytics Agent:

- http://<analytics_agent_host>:<analytics_agent_http_port>/debug/grok: For testing grok patterns
- http://<analytics_agent_host>:<analytics_agent_http_port>/debug/timestamp: For testing timestamp patterns

In the following examples, the Analytics Agent host is assumed to be localhost and the Analytics Agent port is assumed to be 9090. To configure the port on your Agent, use the property ad.dw.http.port in <Analytics_Agent_Home>/conf/analytics-agent.properties.

The Grok Endpoint

The grok tool works in two modes: extraction from a single-line log and extraction from a multiline log. To get a description of usage options:

curl -X GET http://localhost:9090/debug/grok

Single Line

In this mode you pass in (as a POST request) a sample line from your log and the grok pattern you are testing, and you receive back the data you passed in organized as key-value pairs, where the keys are your identifiers.

curl -X POST http://localhost:9090/debug/grok --data-urlencode "logline=log_line" --data-urlencode "pattern=pattern"

For example, the input:

curl -X POST http://localhost:9090/debug/grok --data-urlencode "logline=[2014-09-04T15:22:41,594Z] [INFO ] [main] [o.e.j.server.handler.ContextHandler] Started i.d.j.MutableServletContextHandler@2b3b527{/,null,AVAILABLE}" --data-urlencode "pattern=\\[%{LOGLEVEL:logLevel}%{SPACE}\\] \\[%{DATA:threadName}\\] \\[%{JAVACLASS:class}\\] %{GREEDYDATA:logMessage}"

would produce this output:
{
  threadName => main
  logLevel => INFO
  class => o.e.j.server.handler.ContextHandler
  logMessage => Started i.d.j.MutableServletContextHandler@2b3b527{/,null,AVAILABLE}
}

The input:

curl -X POST http://localhost:9090/debug/grok --data-urlencode "logline=2010-05-05,500.98,515.72,500.47,509.76,4566900,509.76" --data-urlencode "pattern=%{DATE:quoteDate},%{DATA:open},%{DATA:high},%{DATA:low},%{DATA:close},%{DATA:volume},%{GREEDYDATA:adjClose}"

would produce this output:

{
  open => 500.98
  adjClose => 509.76
  volume => 4566900
  quoteDate => 10-05-05
  high => 515.72
  low => 500.47
  close => 509.76
}

Multi-line

The multiline version uses a file stored on the local filesystem as the source input.

curl -X POST http://localhost:9090/debug/grok --data-urlencode "logline=`cat FILE_NAME`" --data-urlencode "pattern=pattern"

where FILE_NAME is the full path of the file that contains the multiline log.

The Timestamp Endpoint

The timestamp tool extracts the timestamp from a log line. To get a description of usage options:
curl -X GET http://localhost:9090/debug/timestamp

In this mode you pass in (as a POST request) a sample line from your log and the timestamp pattern you are testing, and you receive back the timestamp contained within the log line.

curl -X POST http://localhost:9090/debug/timestamp --data-urlencode "logline=log_line" --data-urlencode "pattern=pattern"

For example, the input:

curl -X POST http://localhost:9090/debug/timestamp --data-urlencode "logline=[2014-09-04T15:22:41,237Z] [INFO ] [main] [io.dropwizard.server.ServerFactory] Starting DemoMain" --data-urlencode "pattern=yyyy-MM-dd'T'HH:mm:ss,SSSZ"

would produce this output:

{
  eventTimestamp => 2014-09-04T15:22:41.237Z
}

The input:

curl -X POST http://localhost:9090/debug/timestamp --data-urlencode "logline=Nov 17, 2014 8:21:51 AM com.foo.blitz.processor.core.hbase.coprocessor.endpoint.TimeRollupProcessEndpoint$HBaseDataFetcher callFoo1" --data-urlencode "pattern=MMM d, yyyy h:mm:ss aa"

would produce this output:

{
  eventTimestamp => 2014-11-17T16:21:51.000Z
}

Configuring Transaction Analytics

On this page:
  Select the Application and the Business Transactions
  Configure Data Collectors

Related pages:
  Configuring Application Analytics Data Sources
  Event Data for Transaction Analytics

Make sure you have installed and configured the components described in Installing Agent-Side Components and, for on-premise, Install the Controller and Install the Events Service, before attempting to configure Transaction Analytics.

Select the Application and the Business Transactions

Configuring Transaction Analytics consists of selecting the application and the specific business transactions that you want to analyze. You can also enable the collection of additional business data using data collectors. To enable and configure Transaction Analytics:

1. In the Controller UI, from the Home page, select Analytics > Configuration.
2. Select the application.
3. Specify the business transactions to report analytics data.

Even if you have pre-existing data collectors defined, you will not get Analytics data unless you make sure that the appropriate business transactions are enabled for analytics.

Configure Data Collectors

This step is optional. For convenience, the Configure Transaction Analytics page includes sections for configuring data collectors (both HTTP and method invocation data collectors).

HTTP Data Collectors

To collect HTTP request data, you can use the default HTTP Request Data Collector. You need to explicitly enable the collector for Analytics.
To configure HTTP data collectors for Analytics, use these steps:

1. From the Transaction Analytics Configuration page, select the HTTP data collector to enable and click Edit (or Add to create a new collector). In the popup screen, confirm the data to collect and confirm that Transaction Analytics is checked.
2. Click Configure Transactions Using this Data Collector and confirm that the data collector is enabled on the appropriate business transactions.

See Collecting Application Data for more information.

Method Invocation Data Collectors

You can also use the Analytics > Configuration screen to enable existing or new method invocation data collectors. Open the Method Invocation Data Collectors panel. The process is essentially the same as described in Collecting Application Data. Make sure that you:

- Check the Transaction Analytics check box to use this collector for Application Analytics.
- Use the Configure Transactions popup to confirm you have enabled the right business transactions.

Using the Application Analytics UI

On this page:
  Identify the Data to Analyze
  Visualize the Data on Analytics Dashboard

This section provides an overview of how to use the Application Analytics UI. There are two steps:

1. Identify the data you want to analyze.
2. Visualize the data.

Identify the Data to Analyze

After the Events Service is installed (for on-premise) and Analytics is enabled on the Controller, whether locally by you or in a SaaS environment by AppDynamics, you can navigate to the
Analytics interface from the Controller UI by clicking Analytics in the top navigation bar. From the Search & Analyze page, use the dropdown menu at the top of the page to select the data type you want to work with. You see the analytics data types licensed and enabled for your application.

Exploring Analytics Data

All four data types share the same basic layout on the Data tab of the Search & Analyze page. Use the Data tab to select the data to review and analyze. Use the Visualization tab to see graphic representations based on that data. First identify your data set. This is an annotated sample of the Data tab for Log Analytics.
Strategies for Locating Data of Interest
Use a saved search by clicking the dropdown in the upper left corner
Use the vertical ellipsis icon (three-dot menu) to manage named searches (saving, duplicating, and so on)
Create searches for specific use cases. For details, see Constructing Searches for Application Analytics.
Focus on a specific time range by dragging your mouse across the event stream or by using the time range dropdown
Select various fields from the data, targeting the kind of data of interest to you, then scan the event list:
Double-click any specific event to display detailed information
Examine Top 10 Values. Click a field to see the top 10 values of that field in your filtered data. The results are presented as a count and percentage of all data within the specified time range for that field. This can help you get immediate insights from your data without any predefined rules or prior knowledge of the data. You can add a value to the search criteria bar by hovering over the value and clicking the plus icon that appears to the right.
Relevant Fields (Beta feature, available only for Transaction and Log Analytics): Once you have added at least one filter, you can use Relevant Fields to find data fields with a high Relevance Score. A high relevance score indicates that these fields are significantly more common in your search results than in the entire data set, and thus may be particularly useful to investigate. Click a field to add it as an additional filter. In this example, the Inventory-Services Tier has a high Relevance Score when a filter for the Error User
Experience is used.
Create New Fields (Log Analytics): Use this button to dynamically create a new field extraction definition for the log source type. See Configuring Log Analytics for more information on field extraction definitions. You must define the field manually using Java regex patterns; Grok patterns are not supported.
Visualize the Data on Analytics Dashboard
Once you have defined your preferred data set, use the Visualization tab to explore the specific aspects of the data that interest you and to drill down into the relationships you need to understand. Each visualization type is a widget, and widgets can be added and removed as desired. To add a new visualization widget to an existing dashboard:
1. Click the Widget Builder icon.
The Widget Builder appears.
2. Drag and drop Fields to define the relationship you want to investigate.
3. Use the Visualize section to select the display type you want to use.
4. To make this widget a part of your permanent Dashboard, click Save & Back to Dashboard.
Constructing Searches for Application Analytics
On this page:
Search for a Specific Field Value
Using a Free Text Search (Log Analytics)
There are two ways to search your analytics data:
Field value search (available for all data types)
Free text search (available for Log Analytics)
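The two search styles can be illustrated with a rough Python sketch. This is an analogy only: actual matching is performed server-side by the Events Service, and the event records and messages below are hypothetical.

```python
# Analogy only: real matching is performed server-side by the Events Service.
# The events and messages below are hypothetical.
events = [
    {"sourcetype": "glassfish-server-log",
     "message": "ERROR NullPointerException in checkout"},
    {"sourcetype": "apache-access-log",
     "message": "GET /cart HTTP/1.1 200"},
]

# Field value search: compare a named field against a specific value.
field_hits = [e for e in events if e["sourcetype"] == "glassfish-server-log"]

# Free text search (Log Analytics): case-insensitive keywords matched
# anywhere in the log entry; multiple keywords behave like AND.
def free_text(events, *keywords):
    return [e for e in events
            if all(k.lower() in e["message"].lower() for k in keywords)]

text_hits = free_text(events, "error", "nullpointerexception")
```

The field value search narrows on structured fields; the free text search scans the whole entry, which is why it is offered only for log data.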
A field value search enables you to search for a specific value in a field. For Log Analytics, the free text search enables you to search for any keyword or string anywhere in the log. This is useful for fields such as "Message" that have highly variable contents.
Search for a Specific Field Value
To create a search using a field value:
1. Click Add Search Criteria. The field selector dropdown appears, so you can select the field you want to use:
2. Select a field. The field name appears in the search criteria line:
3. Provide a value for the field by using the value editor. Click the down triangle next to the field name to open the value editor. Common options for these values are described here:
Exact string match: Find all fields where the value is exactly this string. This search is case sensitive. Example: glassfish-server-log
Wildcard string match: Find all fields where the value is this string plus one additional character <?> or multiple characters <*>. No wildcard characters can appear in the first three characters of the search term. Example: apache*
Regex: Find all fields where the value matches this regex pattern. No regex special characters can appear in the first three characters of the search term. Example: regex: glassfish.*-log
Other options: Any of the entry types used in free text search.
Using a Free Text Search (Log Analytics)
To create a search using free text:
1. Enter your search in the free text box. You need to enter all the terms in a single entry. The following describes common options for entries in the free text box:
Keyword: Find log entries with this keyword. This search is case insensitive. Example: Error
Keyword AND Keyword: Find log entries containing both keywords. Example: ERROR NullPointerException
String: Find log entries that contain this exact string. Example: "Windows"
String AND Keyword: Find log entries that contain this exact string AND this keyword. Example: "Mozilla" INFO
Match with Wildcard: Find log entries that contain the expanded wildcard pattern. No wildcard characters can appear in the first three characters of the search term. Example: com.singularity.*.beans.*
Match Not Wildcard: Find log entries that contain the exact string pattern; wildcard characters are treated as literal characters (? is the character "question mark"), not wildcards. Example: "com.singularity.*.beans.*"
Keyword AND/OR Keyword: Find log entries that contain one keyword and one or the other of two additional keywords. Example: error AND (failure OR unhealthy)
Keyword OR Keyword: Find log entries that contain one or the other of two keywords. Example: failure OR unhealthy
Regex: Find log entries that contain this regex pattern. No regex special characters can appear in the first three characters of the search term. Example: regex: glassfish.*-log
Escape Special Character: Treat the escaped special character as a literal character and not as a special character. Example: search \* request
Creating Metrics for Alerts
On this page:
If you create an analytics search that you want to execute repeatedly and monitor, you can create a metric from that search. You can use the new metric to create alerts using Health Rules to trigger Policies and Actions. (New in 4.1.1) You can create metrics for Browser and Mobile Analytics (Beta) as well as Transaction and Log Analytics.
To create a metric:
1. Set filters to select the appropriate data.
2. Click the three-dot menu and select Create Metric from Search Query. The search will execute once per minute and report the total number of results as a metric.
3. In the popup, give your metric a name and a description. This determines how it appears in the Metrics screen and the Metric Browser.
To monitor the metric:
1. Click Metrics in the left navigation bar. The Metrics screen opens.
2. Edit or delete metrics from here. If you see a status that says "Disabled due to repetitive failures", you can re-enable the metric by clicking Edit and checking Enabled.
3. To see the metric in the Metric Browser, click Metric Browser.
Troubleshooting
For general troubleshooting, check the logs for errors or warnings. The components write log information as follows:
Analytics App Agent Plugin: The plugin writes logs to the same file as the App Agent. The logs are in the following location: <application_home>/<app_agent_home>/logs. The primary log file to use for troubleshooting is the file named agent.<timestamp>.log. Search the file for messages written by the Analytics service.
The Analytics Agent writes log messages to files in the following directory: <analytics_agent_home>/logs
The Analytics Agent records startup errors to: <analytics_agent_home>/startup.log
The Events Service writes log messages to files in the following directory: <events_service_home>/logs. In particular, the analytics-all.log file can help you with troubleshooting.
The Events Service records startup errors to: <events_service_home>/<events_service_name>-startup.log, where <events_service_name> is the process name, such as analytics-zookeeper.
In addition to checking log files, try the following:
Make sure that your configuration files are properly configured with the appropriate account name and key. Escape slashes in the account name or key values.
If you are having issues with metrics/alerts performance (scheduled queries), try increasing the query batch size, using the analytics.scheduledqueries.batch.size property in the Controller Administration Console.
To access the Console, see Access the Administration Console, then select Controller Settings. The default batch size is 96.
Application Analytics Licenses
On this page:
Application Analytics License Information
License Types
Overages
Updating an On-Prem License
You acquire your Application Analytics license from your AppDynamics sales representative.
Application Analytics License Information
Your Application Analytics license is separate from your Controller license.
To view license information:
1. In the upper right section of the Controller UI, click Gear Icon -> License.
2. See the Application Analytics panel.
License Types
There are two types of licenses associated with Application Analytics:
Transaction Analytics
Log Analytics
Transaction Analytics
Transaction Analytics licenses are based on two units:
Volume of data: Measured as a specific number of Business Transaction events. A single license unit is equal to 0.5 million events per day, purchased per year.
Retention time:
SaaS: The default retention period is 30 days. This period can be increased to 60 or 90 days.
On-prem: Always 90 days.
Log Analytics
Log Analytics licenses are based on two units:
Volume of data: Measured as the size of incoming data. A single license unit is equal to 5 GB of incoming data per day, purchased per year.
Retention time:
SaaS: The default retention period is 7 days. This period can be increased to 30, 60, or 90 days.
On-prem: Always 90 days.
Browser and Mobile Request Analytics
During the 4.1 Beta timeframe, the retention time of Browser and Mobile Request Analytics data is the same as for EUM, which is two weeks.
Overages
How overages are handled is determined by the terms of your Analytics license agreement. If your license does not allow overages, AppDynamics stops capturing analytics data after your limit has been reached. If your license does allow overages and your usage exceeds the limit, AppDynamics continues capturing analytics data and bills you for the overage at the unit rate stipulated by your license agreement, pro-rated over the volume that exceeded the limit.
Updating an On-Prem License
If you are updating a previous license, be aware that updated Analytics licenses may not automatically propagate on the Controller across multiple user accounts in an on-prem instance. These licenses must be updated manually. To update the licenses, open the Administration Console, then:
1. Select Accounts.
2. Double-click each account to open it.
3. In the top section, adjust the License Date manually.
4. Scroll down to the Application Analytics section and update the license units provisioned, as necessary.
5. Save.
6. Repeat for all accounts.
Event Data for Log Analytics
On this page:
Log Analytics Data
Sample Log Analytics Job Files
Related pages:
Configuring Log Analytics
Data collected from log files depends on the source of the log file and the pattern that you specify for structuring the data in the log. Each log entry is an event in the Log Analytics event stream.
Log Analytics Data
For example, the following fields are captured for a standard syslog (log4j) format:
pickuptimestamp: The timestamp when the agent picked up the event and sent it to the Analytics Agent
Message: The message body of the log event
host: IP address or host name where the event was generated
source: Location of the logs, usually a path or directory such as /tomcat/logs
sourcetype: The kind of log file, such as apache-httpserver-access-log
nodename (optional): Name of the node where the log event occurred
tiername (optional): Name of the tier where the log event occurred
appname (optional): Application name where the log event occurred
Timestamp: Timestamp of the log event
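As a sketch of how fields like the ones above might be pulled out of a log4j-style line: the product defines extractions with Java regex patterns, while this illustration uses Python; the pattern, the sample line, and the group names here are hypothetical, not the product's actual definitions.

```python
import re

# Illustrative only: the pattern, sample line, and group names are
# hypothetical. Note that Python's named-group syntax (?P<name>...) differs
# slightly from Java's (?<name>...), which the product uses.
LOG4J_LINE = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+"
    r"(?P<level>[A-Z]+)\s+"
    r"(?P<logger>\S+)\s+-\s+"
    r"(?P<message>.*)"
)

line = "2015-06-01 12:30:45 ERROR com.example.Checkout - payment gateway timeout"
fields = LOG4J_LINE.match(line).groupdict()
```

Each captured group becomes a searchable field on the event, alongside the built-in fields such as host and sourcetype.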
Sample Log Analytics Job Files
A number of sample job files are shipped with the Analytics Agent, including the following:
Analytics logs: sample-analytics-log.job
Apache access logs: sample-apache-httpserver-access-log.job
Apache error logs: sample-apache-httpserver-error-log.job
Cassandra logs: sample-cassandra-log.job
CouchDB logs: sample-couchdb-log.job
Glassfish logs: sample-glassfish-log.job
Java Agent logs: sample-java-agent-log.job
Jetty error logs: sample-jetty-error-log.job
Jetty request logs: sample-jetty-request-log.job
Log4J: sample-log4j.job
MongoDB logs: sample-mongodb-log.job
MySQL error logs: sample-mysql-error-log.job
Nginx access logs: sample-nginx-access-log.job
Nginx error logs: sample-nginx-error-log.job
OS X system logs: sample-osx-system-log.job
Postgres logs: sample-postgres-log.job
Redis logs: sample-redis-log.job
Stock quotes: sample-stock-quotes-csv.job
WebLogic logs: sample-weblogic-log.job
Event Data for Browser and Mobile Request Analytics
Browser RUM Analytics provides details about each browser user request. The data collected includes information about pages, performance metrics, location of your users, browser and device data, errors, and any custom data that you configured in your Browser RUM configuration. For details on the meaning of each metric, see Browser RUM Metrics.
Mobile RUM Analytics provides details about the performance of your mobile apps as experienced by your end users. The data collected includes information about the mobile app names and versions, network requests, performance times, locations, carrier and device data, errors, and any custom data that you configured in your Mobile RUM configuration. For details on the meaning of each metric, see Mobile RUM Metrics.
Event Data for Transaction Analytics
On this page:
Default Transaction Data
Custom HTTP Request Data
Custom Method Data
Each instance of a business transaction passing through a single tier is an event, so the data associated with a single execution of a distributed business transaction (for example, Checkout) is
stored as a series of events in Analytics. The data consists of the default transaction data and additional data from data collectors configured for that transaction. This data is organized and stored per business transaction.
Default Transaction Data
The following data is collected by default for each instance of transactions that are enabled for Analytics.
eventtimestamp: Time the event occurred in the application
application: Application name
transactionname: Business transaction name
error: Error details
node: Node name
tier: Tier name
requestguid: GUID for this specific user request as assigned by AppDynamics
transactiontime: Business transaction response time for this request in milliseconds
requestexperience: Indicates if the transaction was marked as Normal, Slow, Very Slow, Stall, or Error
pickuptimestamp: Timestamp of the first event for this transaction to arrive at the Events Service
Exit calls: Details of database and remote service calls
Custom HTTP Request Data
When HTTP data collectors are configured, the following information can be collected:
cookies
headers
parameters
principal
session ID
session Objects
URL
URI path segments: segment0-n
Custom Method Data
Custom data collected as specified in the method invocation data collector configuration. See Collecting Application Data for more details on data collectors.
Installing Agent-Side Components
On this page:
Enable the Analytics Agent
Enable Analytics on the Controller
In all deployment scenarios, to use AppDynamics Application Analytics, you must enable the Analytics Agent-side components. This section describes the steps to set up each component. As described in Deployment Options and Scenarios, whether you enable one or many Analytics Agents for your deployment depends on your requirements.
To collect log data, the machine must have either a Standalone Machine Agent (Java) or the Analytics Agent (.NET) deployed.
To collect log and transaction data, the machine must have either a Standalone Machine Agent or the Analytics Agent deployed and Analytics enabled on the Controller.
To collect only transaction data, the machine must have an App Agent with Analytics enabled on the Controller. It must also have access to a Standalone Machine Agent (Java) or the Analytics Agent (.NET) somewhere in the environment.
Select the components you need based on your requirements. In all environments, to collect transaction data from an application, you must have deployed an appropriate app agent. If you already use AppDynamics APM, there are probably app agents in your environment. To access the Analytics functionality, the app agent must be version 4.0 or later. See Instrument Java Applications and Instrument .NET Applications for more information.
The Java App Agent and the .NET App Agent are often simply called the Java Agent and the .NET Agent elsewhere in the documentation. "App" is added here only for clarity.
Enable the Analytics Agent
The Analytics Agent is not enabled by default. To use Application Analytics, you must:
Have a separate Application Analytics license. See Application Analytics Licenses for more information.
Enable the Analytics Agent.
Enable Analytics on the Controller.
Enable the Analytics Agent (Java)
The Analytics Agent is implemented as an extension to the Standalone Machine Agent (and runs as a machine agent monitor). In Java environments, use the following steps:
1. On each host where the Standalone Machine Agent is deployed and you want to enable the Analytics Agent, use a text editor to open <machine-agent-home>/monitors/analytics-agent/monitor.xml.
2. Set the enabled tag to true as follows, saving the file when you are finished:
<monitor>
<name>AppDynamics Analytics Agent</name>
<type>managed</type>
<!-- Enabling this requires JRE 7 or higher -->
...
<enabled>true</enabled>
3. Configure connectivity from the Analytics Agent to the Events Service by editing the following file: <machine-agent-home>\monitors\analytics-agent\conf\analytics-agent.properties
4. In the analytics-agent.properties file, change the default URL and, if necessary, the port number for the connection to the Events Service by modifying the http.event.endpoint value. For example:
http.event.endpoint=http://<events_service_host:events_service_port>
For SaaS-based installations, the host and port are https://analytics.api.appdynamics.com:443. For on-premise installations, use whatever host and port you have configured; in clustered environments, this is often a load balancer. SaaS installations can also use http, in which case the port would be 80.
5. Configure the account and account key where the agent should publish data as the accountname and accesskey values. For example:
# The account in the Controller with which this Analytics data is associated.
http.event.accountname=<global_account_name of the format customer1_74678b04-8a71-40ef-acaf-9adb05eeb815>
# Replace this value with the access key of the account name configured above.
http.event.accesskey=<a_long_key_value such as SJ5b2m7d1$354>
The account name is the global account name of the account, available on the View License UI of the Controller. The access key provides an authentication mechanism between the Controller and the components of the Application Analytics deployment. The Controller installation process generated the accesskey value. It is also available on the License screen in the Controller. The following screenshot shows the License UI where you can find the appropriate values.
6. If collecting log information on this host, configure log analytics using one or more of the pipeline templates. For details, see Configuring Log Analytics.
7. Save and close the file.
8. If the Machine Agent is already running at this point, restart it to pick up the configuration changes.
To connect to the Events Service through a proxy server, see Connect the Agent to the Events Service through a Proxy.
Enable the Analytics Agent (.NET)
For .NET installations, you install the separate Analytics Agent, which is written in Java. As described in Deployment Options and Scenarios, the Analytics Agent can run on the same host as the monitored applications or on a different host, depending on your use case. This procedure installs the Analytics Agent as a Windows service. If you are running the Standalone Machine Agent on the same machine as your .NET app agent, it is not necessary to install a separate Analytics Agent. See To Enable the Analytics Agent (.NET) Using the Standalone Machine Agent.
To enable the Analytics Agent for .NET:
The Windows installer that installs the Analytics Agent as a service is new in 4.1.2.
1. Unzip the Analytics Agent distribution archive to the installation directory on each target host. When you unzip this archive, you get three directories:
bin: contains the scripts for Windows (.bat) and Linux (.sh)
lib: contains all the jar files that need to be in the class path
conf: contains all the configuration files, such as the properties and vmoptions files
2. Follow the steps to revise the properties file. Note that the analytics-agent.properties file is found in the <analytics-agent-home> directory in this case: <analytics-agent-home>\conf\analytics-agent.properties
3. If collecting log information on this host, configure log analytics. For details, see Configuring Log Analytics.
4. Save and close the file.
5. Run the following command to install the Analytics Agent as a service:
bin\analytics-agent.bat install-analytics-agent
6. Now your Analytics Agent can be managed like any other Windows service. Start the service using the method of your choice, as described in To Start and Stop the Analytics Agent (.NET).
Revise the Analytics Agent Properties File
Follow these steps to revise the properties file. Note the differences in the location of this file depending on your exact deployment scenario.
1. Use a text editor to open the properties file. The analytics-agent.properties file is found either in the <analytics-agent-home>/conf directory or under <machine-agent-home>/monitors/ (in the case where your .NET app agent and the Standalone Machine Agent are running on the same machine).
2. Change the default URL and, if necessary, the port number for the connection to the Events Service by modifying the http.event.endpoint value. For example:
http.event.endpoint=http://<events_service_host:events_service_port>
Where the host and port are either https://analytics.api.appdynamics.com:443 for SaaS-based installations, or whatever host and port you have configured for on-premise installations (in clustered environments, often a load balancer). SaaS installations can also use http, in which case the port would be 80.
3. Configure the account and account key where the agent should publish data as the accountname and accesskey values. For example:
# The account in the Controller for this analytics data.
http.event.accountname=<global_account_name of the format customer1_74678b04-8a71-40ef-acaf-9adb05eeb815>
# Replace this value with the access key of the account name configured above.
http.event.accesskey=<a_long_key_value such as SJ5b2m7d1$354>
The account name is the global account name of the account, available on the View License UI of the Controller. The access key provides an authentication mechanism between the Controller and the various components of the Application Analytics deployment. The accesskey value is generated during the Controller installation process and is also available on the View License UI of the Controller.
4. Change these properties, which are needed to run the Analytics Agent as a Windows service:
ad.dw.log.path=<analytics-agent-home>/logs
conf.dir=<analytics-agent-home>/conf
To Enable the Analytics Agent (.NET) Using the Standalone Machine Agent
In the case where you are running both the .NET App Agent and the Standalone Machine Agent on the same machine, it is not necessary to install a separate Analytics Agent. You can take advantage of the bundled analytics extension to the Machine Agent. Use the following steps:
1. Confirm that you are running the Standalone Machine Agent and the .NET app agent on the same machine.
2. Follow the steps to revise the properties file. Note that the analytics-agent.properties file is found under the machine agent install directory in this case: <machine-agent-home>\monitors\analytics-agent\conf\analytics-agent.properties
3. Save and close the file.
4. If collecting log information on this host, configure log analytics. For details, see Configuring Log Analytics.
5. Install the Analytics Agent by navigating to the bin directory and running the following command:
bin\analytics-agent.bat install-analytics-agent
6. Now your Analytics Agent can be managed like any other Windows service. Start the service using the method of your choice, as described in To Start and Stop the Analytics Agent (.NET).
To Start and Stop the Analytics Agent (.NET)
You can use the native Windows Services menu to start and stop the service, or you can do it directly from the command line using the following two commands.
To start the agent service from the command line:
bin\analytics-agent start-service
If you need to change any JVM start-up options, use a text editor to modify <analytics-agent-home>\conf\analytics-agent.vmoptions
To stop the agent service from the command line:
bin\analytics-agent stop-service
To Uninstall the Windows Service
Run the .bat file with the uninstall command as follows:
bin\analytics-agent.bat uninstall-analytics-agent
Troubleshooting Tips
Make sure that:
The properties in analytics-agent.properties are properly set.
The JRE version is >= 1.7 and the JAVA_HOME variable is set in the environment.
All the properties in analytics-agent/conf/analytics-agent.vmoptions are compatible with the JRE.
Verify Agent Start
To verify that the Analytics Agent has started, look for the following entry in the App Agent log file: "Started [Analytics] collector"
If you need to connect to the Events Service through a proxy server, see Connect the Agent to the Events Service through a Proxy below.
Enable Analytics on the Controller
Once you have set up the Analytics Agent, you need to enable Analytics on the Controller.
1. In the Controller UI, select the Analytics tab.
2. Select Configuration.
3. Select the application you wish to monitor from the dropdown menu.
4. Check Enable.
Configuring the Analytics Agent on a Different Host
The default settings assume that the Analytics Agent is on the same host and uses the same default port as your App Agent. If your Analytics Agent is on a host separate from the monitored application, or you have changed the default port, you need to specify the new host and port values.
For Java
Specify the location of the remote Analytics Agent with a -D parameter to the JVM, as follows:
-Dappdynamics.analytics.agent.url=http://<analytics-agent-ip>:9090/v1/sinks/bt
Replace <analytics-agent-ip> with the hostname of the Analytics Agent for your environment.
For .NET
Specify the location of the remote Analytics Agent with a .NET environment variable: add a system environment variable named "appdynamics.analytics.agent.url" and set the value to http://<analytics-agent-ip>:9090/v1/sinks/bt. Replace <analytics-agent-ip> with the hostname of the Analytics Agent for your environment. Reboot the machine to ensure that this takes effect.
Although you can also use a User environment variable, the user under which the environment variable is set must have the same permissions as the user under which all the instrumented apps are running. To avoid issues, we recommend using a System environment variable. While it is not strictly required that you restart your machine, the parent process that invokes the monitored process must have the environment variable set; the easiest way to ensure this is to restart the machine.
To add an environment variable, go to System Properties > Advanced system settings > Environment Variables.
Add a new environment variable as shown:
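The same endpoint string is supplied either way: as a JVM -D property on Java or as a system environment variable on .NET. A small sanity-check sketch (the hostname below is a placeholder; port 9090 and the /v1/sinks/bt path come from the documented URL format):

```python
import os

# The hostname is a placeholder; 9090 and /v1/sinks/bt come from the
# documented agent URL format.
analytics_agent_ip = "analytics-host.example.com"  # hypothetical hostname
url = "http://{}:9090/v1/sinks/bt".format(analytics_agent_ip)

# .NET-style: expose the value as an environment variable.
os.environ["appdynamics.analytics.agent.url"] = url

# Java-style: the same value would be passed as a -D JVM argument.
jvm_arg = "-Dappdynamics.analytics.agent.url=" + url
```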
Connect the Agent to the Events Service through a Proxy (Optional)
In some environments you need to connect to the Events Service through a proxy server.
a. Java: Open <machine-agent-home>\monitors\analytics-agent\conf\analytics-agent.properties with a text editor.
b. .NET: Open <analytics-agent-home>\conf\analytics-agent.properties with a text editor.
Add this information:
# optional proxy properties
http.event.proxyhost=<your proxy host>
http.event.proxyport=<your proxy port>
http.event.proxyusername=<your proxy username, if authentication is required>
http.event.proxypassword=<your proxy password, if authentication is required>
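The endpoint, account, key, and optional proxy properties described in this section can be sanity-checked with a minimal .properties reader. This is a hedged sketch: it handles only the simple key=value lines shown in this document, and the sample values are placeholders, not real credentials or hosts.

```python
# Minimal .properties reader for sanity-checking the keys discussed in this
# section. Handles only simple key=value lines; sample values are placeholders.
def load_properties(text):
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, sep, value = line.partition("=")
        if sep:
            props[key.strip()] = value.strip()
    return props

SAMPLE = """
# analytics-agent.properties (placeholder values)
http.event.endpoint=http://events.example.com:9080
http.event.accountname=customer1_74678b04-8a71-40ef-acaf-9adb05eeb815
http.event.accesskey=SJ5b2m7d1$354
http.event.proxyhost=proxy.example.com
http.event.proxyport=3128
"""

props = load_properties(SAMPLE)
required = {"http.event.endpoint", "http.event.accountname",
            "http.event.accesskey"}
missing = required - props.keys()
```

A check like this can catch the most common troubleshooting issues noted earlier: a missing account name or key, or a malformed endpoint URL.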