Index

Type aliases

StatsFilterConfig

StatsFilterConfig: object | object | object

Configuration for filtering performance samples.

Variables

Const TRI_PERFORMANCE_NAME_PREFIX

TRI_PERFORMANCE_NAME_PREFIX: string = "tri"

Const extractRegExp

extractRegExp: RegExp = new RegExp(
    `^${TRI_PERFORMANCE_NAME_PREFIX}` +
    // owner name
    `/([^/]+)` +
    // function name
    `/([^:]+)` +
    // unique suffix
    `:([^:]+)`,
    // No need to handle :start/:end
    // because we can get that from the
    // sample itself.
)
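
As a hedged illustration (the specific entry name below is invented, but follows the prefix/owner/function:suffix shape the pattern expects), the regular expression can pull the owner name, function name, and unique suffix out of a recorded entry name:

    const match = extractRegExp.exec("tri/Foo/doFoos:42");
    if (match) {
        const [, ownerName, functionName, suffix] = match;
        // ownerName === "Foo", functionName === "doFoos", suffix === "42"
    }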

Const logger

logger: Logger = new logging.Logger("performance")

Functions

extractMetricName

listenForCounters

  • listenForCounters(statsLogger?: StatsLogger): PerformanceObserver
  • Start listening for performance counters. Note that you must attach the returned PerformanceObserver to some long-lived object such as the window: there is a known quirk that causes PerformanceObservers to be garbage-collected even while they are still attached and observing. A minimal usage sketch follows the parameter list below.

    Parameters

    • Optional statsLogger: StatsLogger

      If given, stats will be logged directly to the given stats logger. If absent, stats will be sent to the performance_background receiver using messaging.

    Returns PerformanceObserver
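
    A minimal usage sketch, assuming this module is imported as perf (the import path and the triPerfObserver property name are invented for illustration):

    import * as perf from "./performance";  // assumed import path

    // Keep the observer reachable from a long-lived object so it is not
    // garbage-collected while it is still observing.
    (window as any).triPerfObserver = perf.listenForCounters();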

measured

  • measured(cls: any, propertyKey: string, descriptor: PropertyDescriptor): PropertyDescriptor
  • Decorator for performance measuring. If performance is enabled, wraps the function call with performance marks and a measure that can be used for profiling. The mark's ownerName will be the name of the containing class and the functionName will be the name of the function. For example:

    class Foo {
        @perf.measured
        doFoos() { stuff() }
    }

    These counters can be obtained using listenForCounters and a StatsLogger.

    Parameters

    • cls: any
    • propertyKey: string
    • descriptor: PropertyDescriptor

    Returns PropertyDescriptor

measuredAsync

  • measuredAsync(cls: any, propertyKey: string, descriptor: PropertyDescriptor): PropertyDescriptor
  • Like the @measured decorator, but properly handles async functions by chaining onto the returned promise so that completion is marked when the promise resolves. See the sketch after the parameter list below.

    Parameters

    • cls: any
    • propertyKey: string
    • descriptor: PropertyDescriptor

    Returns PropertyDescriptor
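
    A short sketch of how this might look in practice (the class and method names are invented; usage mirrors the @measured example above):

    class Foo {
        @perf.measuredAsync
        async doFoosAsync() {
            await stuff()
        }
    }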

performanceApiAvailable

  • performanceApiAvailable(): boolean

renderStatsHistogram

  • renderStatsHistogram(samples: PerformanceEntry[], buckets?: number, width?: number): string
  • Pretty-prints a pile of performance samples of type measure (others won't work because they have duration zero or undefined) as a horizontal ASCII histogram. Useful if you just want basic statistics about performance and don't want to spend a bunch of time mucking about in python or julia. A usage sketch follows the parameter list below.

    A very small example of what you'll get:

      0 ####
    125 ##########
    250 ###############
    375 ######
    500 ##

    Parameters

    • samples: PerformanceEntry[]

      A set of samples to plot.

    • Default value buckets: number = 15

      The number of bins to divide the samples into.

    • Default value width: number = 80

      The width of the chart.

    Returns string
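
    A minimal usage sketch, assuming this module is imported as perf and that the browser's Performance API is available:

    // Collect all "measure" entries recorded so far and print an ASCII histogram.
    const samples = performance.getEntriesByType("measure");
    console.log(perf.renderStatsHistogram(samples, 15, 80));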

sendStats

  • sendStats(list: PerformanceEntryList): void
  • Parameters

    • list: PerformanceEntryList

    Returns void