analyzer(1)                      User Commands                     analyzer(1)



NAME
       analyzer  -  Graphical tool for analyzing a program performance experi‐
       ment

SYNOPSIS
       analyzer [-j|--jdkhome jvm-path] [-J jvm-options]
            [-f|--fontsize size] [-u|--userdir dir_path]
            [-v|--verbose] [experiment-name]


       analyzer [-j|--jdkhome jvm-path] [-J jvm-options]
            [-f|--fontsize size] [-u|--userdir  dir_path]
            [-v|--verbose] -c base-group compare-group


       analyzer -V|--version


       analyzer -?|-h|--help


       analyzer [-f|--fontsize size] [-u|--userdir dir_path]
            [-v|--verbose] target [target-arguments]

DESCRIPTION
       The analyzer command starts Performance Analyzer, which is a Java-based
       graphical  data-analysis  tool  that you can use to analyze performance
       data that is collected from target programs. The data is  collected  by
       the Collector using the Performance Analyzer's Profile Application dia‐
       log box, the collect command, or the collector  commands  in  dbx.  The
       code coverage tool uncover can also invoke the Collector.


       The  Collector  gathers performance information to create an experiment
       during the execution of a process. Performance Analyzer reads  in  such
       experiments,  analyzes  the  data, and displays the data in tabular and
       graphical displays. A command-line version of Performance  Analyzer  is
       available as the er_print utility.


       When  Performance  Analyzer  is  invoked on more than one experiment or
       experiment group, it aggregates data  from  the  experiments.  You  can
       change  the  mode to compare experiments. For more information, see the
       Comparing Experiments section below.
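
       For example, the following commands aggregate and then compare two
       experiments, respectively (the experiment names are illustrative):

         analyzer test.1.er test.2.er
         analyzer -c test.1.er test.2.er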

OPTIONS
       -j|--jdkhome jvm-path

            Specify the path to the Java virtual machine (JVM) software for
            running Performance Analyzer. The default path is determined by
            first examining the environment variables JDK_HOME and then
            JAVA_PATH for a path to the JVM. If neither environment variable
            is set, the java found on your PATH is used. If none is found,
            /usr/java/bin/java is tried.
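
            For example, to run Performance Analyzer with a specific JVM
            installation (the JDK path is illustrative):

              analyzer -j /usr/jdk/instances/jdk1.8.0 test.1.er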


       -Jjvm-option

           Specify  JVM  software  options.  Multiple -J arguments can be sup‐
           plied. Note that there is no space between the -J flag and the jvm-
           option. Examples:


             analyzer -J-d64 -- run the 64-bit analyzer (valid for Java 7 only)
             analyzer -J-Xmx2G -- run with maximum JVM memory of 2 GB
                       (Default, 1 GB)
             analyzer -J-d64 -J-Xmx8G -- run the 64-bit analyzer with maximum
                       JVM memory of 8 GB (-J-d64 valid for Java 7 only)




       -c base-group compare-group

           Specify compare mode. When specified, two experiments or experiment
           groups must be provided as arguments.  Performance  Analyzer  opens
           the  experiment  or  experiment  groups in comparison mode with the
           first experiment or group as the base group and the second  as  the
           comparison group.

           Each group can be a single experiment, or a group containing multi‐
           ple experiments. If you want to include more than one experiment in
           a compare group, you must create an experiment-group file to use as
           a single argument to analyzer.
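
            For example, to compare two experiment groups (the group file
            names are illustrative):

              analyzer -c baseline.erg patched.erg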


       -f | --fontsize size

           Specify the font size to be used in Performance Analyzer.


       -u | --userdir dir_path

           Specify the path to the user directory in which to store user  set‐
           tings.


       -v | --verbose

           Print  version information and Java runtime arguments before start‐
           ing.


       -V | --version

           Print version information and exit.


       -? | -h | --help

           Print usage information and exit.


OPERANDS
       experiment-name

           Specify an experiment to open. To specify more than one experiment,
           provide  a  space-separated  list  of  experiment names, experiment
           group names, or both.
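
            For example (the experiment and group names are illustrative):

              analyzer test.1.er test.2.er mygroup.erg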


       target [target-arguments]

           Specify a target program to be profiled along  with  any  arguments
           for the program.


USAGE
       To  start  Performance Analyzer, type the following command on the com‐
       mand line:

         analyzer [experiment-name]



       If you do not specify experiment names when invoking  Performance  Ana‐
       lyzer, the initial view is the Welcome screen. You can open experiments
       from the Welcome screen or by using Performance  Analyzer's  File  menu
       and  toolbar  buttons  to  open,  compare,  or aggregate experiments or
       experiment groups.


       The optional experiment-name command argument is the name of an
       experiment, or a space-separated list of experiment names, experiment
       group names, or both. Experiments recorded on any supported
       architecture can be displayed by Performance Analyzer running on the
       same or any other supported architecture.


       Multiple experiments or experiment groups can be specified on the
       command line. For information about experiment groups, see the
       Creating Experiment Groups section below.


       If you specify an experiment that has descendant experiments inside it,
       all  descendant  experiments are automatically loaded and their data is
       displayed.


       You can preview an experiment or experiment group for loading  by  sin‐
       gle-clicking on its name in the Open Experiment dialog box.

   Starting Performance Analyzer to Profile an Application
       You  can  start Performance Analyzer to profile an application from the
       command line as follows:

         analyzer target [target-arguments]



       Performance Analyzer starts up with the Profile Application dialog  box
       showing the named target and its arguments, and settings for collecting
       an experiment. For more information, see Recording Experiments below.
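
       For example, the following command opens the Profile Application
       dialog box for a target program (the program name and argument are
       illustrative):

         analyzer myprog input.dat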

   Creating Experiment Groups
       To create an experiment group, create a plain  text  file  whose  first
       line is as follows:

         #analyzer experiment group



       Then add the names of the experiments on subsequent lines. The file
       extension of the experiment group text file must be .erg. You can
       also use experiment groups to load only specific descendant
       experiments if you want to isolate their data from their founder
       experiment.
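
       For example, a group file named mygroup.erg (the experiment names
       are illustrative) might contain:

         #analyzer experiment group
         test.1.er
         test.2.er

       You can then open both experiments with:

         analyzer mygroup.erg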

   Comparing Experiments
       When you invoke Performance Analyzer on more  than  one  experiment  or
       experiment  group, it normally aggregates the data from all the experi‐
       ments. If you invoke Performance Analyzer with the -c flag, Performance
       Analyzer opens experiments in comparison mode.


       You can also compare experiments from the Welcome screen, from the File
       menu, or from the Compare status area on the status bar.


       In comparison mode, the Functions view shows separate columns  of  met‐
       rics for each experiment or group to enable you to compare the data.


       Comparison  style can be set from the Compare panel in Performance Ana‐
       lyzer or in the Settings dialog's Formats tab. By  default,  experiment
       comparison  shows  the  absolute values of the base experiment and com‐
       pared experiments. You can specify comparison style  "Deltas"  to  show
       the  metrics  for the compared experiment as a + or - value relative to
       the base experiment. You can specify comparison style "Ratios" to  show
       the compared experiment metrics as a ratio relative to the base experi‐
       ment. Your selection is saved in your configuration  settings  for  the
       next time you compare experiments.


       Comparing experiments works in most data views except Call Tree, Races,
       Deadlocks, Heap, and I/O Activity.


       The Source and Disassembly Views show a dual pane when comparing exper‐
       iments.  The  Timeline  shows separated data from the experiments being
       compared. The Selection Details window shows data from the base experi‐
       ment only.

   Recording Experiments
       When  Performance  Analyzer  is  invoked  with a target name and target
       arguments, it starts with the  Profile  Application  dialog  box  open,
       allowing you to record an experiment on the named target when you click
       Run in the dialog. You can also record a  new  experiment  by  clicking
       Profile Application in the Welcome page, or clicking the Profile Appli‐
       cation button on the toolbar, or choosing Profile Application from  the
       File menu.


       Press F1 in the Profile Application dialog box to see the Help for more
       information.


       Note that the Profile  Application  dialog  fields  correspond  to  the
       options  available  in  the  collect  command, as described in the col‐
       lect(1) man page.


       The Preview Command button at the bottom of the dialog box enables  you
       to  see  the  collect command that would be used when you click the Run
       button.

   Setting Configuration Options
       Performance Analyzer reads settings from a configuration file.


       You can control the configuration settings  from  the  Settings  dialog
       box. To open this dialog box, click on the Settings button in the tool‐
       bar or choose Settings from the Tools menu.  The  Settings  dialog  box
       enables you to specify settings such as metrics and default data views.


       You  must  click  OK or Apply to apply your changes to the current ses‐
       sion. Your settings are automatically saved to a configuration file  in
       the  experiment when you exit Performance Analyzer. The Settings dialog
       box has an Export button that you can use to save some or  all  of  the
       settings  in  your configuration to other locations so you can share it
       with other experiments or users. When you open an experiment  from  the
       Open Experiment dialog, Performance Analyzer searches default locations
       for available setting configurations, and offers you the choices in the
       dialog. You can also import settings.


       You can choose Tools > Export Settings to .er.rc file to write the
       relevant settings to an .er.rc file, which is used by the er_print
       utility.


       Performance Analyzer processes a few of the possible directives from
       .er.rc files, including en_desc {on|off}, which controls whether or
       not descendant experiments are selected and read when the founder
       experiment is read. Most other directives are ignored by Performance
       Analyzer, although they are processed by er_print.


       The .er.rc files can also specify a path used for C++ name demangling
       for compilers other than Oracle Developer Studio.


       When loading experiments with many descendants, you can set the SP_ANA‐
       LYZER_DISCARD_TINY_EXPERIMENTS  environment variable to control whether
       to ignore processes that had very short profile  durations  or  had  no
       profiling data. The following values are allowed:

           o      -1  -  Ignore  processes without any profiling data. This is
                  the default setting.


           o      0 - Load all processes, regardless of duration or  profiling
                  data.


           o      1  (or  larger)  - Set the minimum threshold in milliseconds
                  for profile durations. Below this  threshold,  process  data
                  will be ignored.



       For example, setting SP_ANALYZER_DISCARD_TINY_EXPERIMENTS=10 causes
       processes that have fewer than 10 milliseconds of profile data to be
       ignored.
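
       For example, in a Bourne-style shell (the experiment name is
       illustrative):

         SP_ANALYZER_DISCARD_TINY_EXPERIMENTS=10 analyzer founder.er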

REMOTE OPERATION
       Performance Analyzer can run on a local system and  connect  to  remote
       systems  where  the  Oracle Developer Studio software is installed. You
       can then profile applications  and  read  experiments  located  on  the
       remote system.


       A  subset  of the complete Performance Analyzer, called the Remote Ana‐
       lyzer, is available for local installation even  on  operating  systems
       such  as Windows or MacOS, which are not supported platforms for Oracle
       Developer Studio.


       The Remote Analyzer is distributed as a tar  file,  RemoteAnalyzer.tar,
       in  the  lib/analyzer  directory  of  the installed product. To install
       Remote Analyzer, copy the tar file to your local system and  unpack  it
       to create a subdirectory, RemoteAnalyzer.
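
       For example (the product installation path is illustrative):

         cp /opt/developerstudio12.6/lib/analyzer/RemoteAnalyzer.tar $HOME
         cd $HOME && tar xf RemoteAnalyzer.tar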


       The  RemoteAnalyzer  directory  contains scripts for running the Oracle
       Developer Studio Performance Analyzer on systems that are not supported
       by the Oracle Developer Studio tools, or systems that are supported but
       do not have the tools installed. The  directory  also  contains  a  lib
       directory  with  the  necessary  components for Performance Analyzer to
       execute.


       There are four scripts, one each for Windows, MacOS, Oracle Solaris,
       and Linux. The scripts run Performance Analyzer on the local system
       and support connecting to a remote host. The remote host must have
       the Oracle Developer Studio tools installed, and the connection
       dialog requires you to enter the path to the tools on the remote
       machine.

   Windows Operation
       Start Performance Analyzer on Windows  by  executing  the  AnalyzerWin‐
       dows.bat  file. You can type the command into a terminal window or dou‐
       ble-click the file in the Windows Explorer. When you  launch  Analyzer‐
       Windows.bat  from  Windows  Explorer,  it creates a terminal window and
       executes the command in that window.


       When Performance Analyzer starts on Windows, it  displays  the  Welcome
       screen.  Many of the options are grayed out because they cannot be used
       locally on the Windows system. The main use of Performance Analyzer  in
       this context is to connect to a remote host. The Welcome screen's docu‐
       mentation links work, but the links for profiling applications or read‐
       ing  or  comparing experiments will not work until Performance Analyzer
       is connected to a remote host.

   MacOS Operation
       Start Performance Analyzer on MacOS by executing the
       AnalyzerMacOS.command file. You can type the command into a terminal
       window or double-click the file in the Finder. When you launch
       AnalyzerMacOS.command from the Finder, it creates a terminal window
       and executes the command in that window.


       When Performance Analyzer starts on  MacOS,  it  displays  the  Welcome
       screen.  Many of the options are grayed out because they cannot be used
       locally on the MacOS system. The main use of  Performance  Analyzer  in
       this context is to connect to a remote host. The Welcome screen's docu‐
       mentation links work, but the links for profiling applications or read‐
       ing  or  comparing experiments will not work until Performance Analyzer
       is connected to a remote host.

   Oracle Solaris and Linux Operation
       Start Performance Analyzer on Oracle Solaris or Linux by executing  the
       corresponding Analyzer*.sh script in a terminal window.
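
       For example, on Linux (the script name follows the Analyzer*.sh
       naming pattern and is illustrative):

         sh RemoteAnalyzer/AnalyzerLinux.sh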


       When  Performance Analyzer starts, it displays the Welcome screen. When
       the full Oracle Developer Studio suite is installed on the system where
       you are running Performance Analyzer, all of the Welcome screen options
       are enabled. When you run the Remote Analyzer on a  system  which  does
       not  have  the  full  suite, many of the options are grayed out because
       they cannot be used without a Studio installation on the local  system.
       The main use of Performance Analyzer in this context is to connect to a
       remote host. The Welcome screen's documentation  links  work,  but  the
       links  for  profiling  applications or reading or comparing experiments
       will not work until Performance Analyzer is connected to a remote host.

   Connecting to a Remote Host
       You can connect to a remote host from  the  Welcome  screen,  the  File
       menu,  or from the Connected status area on the status bar. The Connect
       to Remote Host dialog enables you to specify the host and login  infor‐
       mation  and  then  connect  to the remote host. In the dialog, type the
       remote host name or select a host that you've used  before,  then  type
       the  name and password for a user account that can log in to that host,
       and type the path to the installation of Oracle Developer Studio on the
       remote host. Performance Analyzer remembers the last user name and path
       to installation that you used on each host.


       Click Connect to log in to the host. Once you successfully log in,  you
       again see the Welcome Screen, but this time all the options are enabled
       as if you were directly connected to the remote host.

   Environment Variables for Connecting to a Remote Host
       SP_ANALYZER_HEARTBEAT_DISABLE

            Used only with Performance Analyzer's "Connect to Remote Host"
            option. If defined, this variable disables periodic checking of
            the connection between Performance Analyzer and the remote host.
            By default, a heartbeat packet is sent periodically from
            Performance Analyzer to the remote back-end. When the heartbeat
            is enabled, it can help keep some SSH connections from
            automatically closing due to inactivity.
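
            For example, to disable the heartbeat when starting Performance
            Analyzer from a Bourne-style shell:

              SP_ANALYZER_HEARTBEAT_DISABLE=1 analyzer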


       SP_ANALYZER_HEARTBEAT_MSEC

            Used only with Performance Analyzer's "Connect to Remote Host"
            option. Sets the interval at which the connection between
            Performance Analyzer and the remote host is checked. The default
            is 2000 milliseconds.
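
            For example, to check the connection every 5 seconds:

              SP_ANALYZER_HEARTBEAT_MSEC=5000 analyzer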


       SP_ANALYZER_REMOTE_SHELL

            Set this variable to /usr/bin/ssh to use /usr/bin/ssh for
            connecting Performance Analyzer to a remote host. If this
            variable is not set, the internal jsch library is used. For
            example:

              SP_ANALYZER_REMOTE_SHELL="/usr/bin/ssh -i test-linux-nm1.pem"


            Note - This variable is only available on UNIX-based systems,
            and only for connections that do not require a password or
            passphrase.


WARNINGS
       On some Oracle Solaris systems, the X11 support that Performance
       Analyzer needs is not installed, and Performance Analyzer reports an
       error that it cannot connect to the display. The workaround is to run
       Performance Analyzer on a system with a display and X11 installed,
       and use the Remote Analyzer feature to connect to a remote host.


       Sometimes Performance Analyzer reports that the GC overhead limit was
       exceeded. The workaround is to run with more JVM memory than the
       default 1 GB. To run the analyzer with 2 GB, use:

         analyzer -J-Xmx2G



       To  run  a  64-bit version of Performance Analyzer with 8 GB of memory,
       use:

         analyzer -J-d64 -J-Xmx8G



       The option -J-d64 is only needed  to  run  64-bit  when  using  Java  7
       because Java 8 is 64-bit by default on Oracle Solaris and Linux.

NOTES
       Performance  Analyzer  will  only work on experiments recorded with the
       current version of the tools. It will report an error  for  experiments
       recorded  with any other version. You should use the version of Perfor‐
       mance Analyzer from the release with which the experiment was recorded.

SEE ALSO
       collect(1),    collector(1),    dbx(1),    er_archive(1),     er_cp(1),
       er_export(1),  er_mv(1), er_print(1), er_rm(1), er_src(1), tha(1), lib‐
       collector(3)


       Performance Analyzer manual


       Performance Analyzer Tutorials



Studio 12.6                        May 2017                        analyzer(1)