Feature-Specific Profiling
1 Quick Start
2 Using the Profiler
feature-profile
feature-profile-thunk
feature-profile-compile-handler
default-features
2.1 Default features
contracts-feature
output-feature
generic-sequence-dispatch-feature
type-casts-feature
keyword-optional-arguments-feature
pattern-matching-feature
send-dispatch-feature
3 Adding Profiling Support to Libraries
3.1 Feature Instrumentation
make-latent-mark-compile-handler
default-syntactic-latent-mark-keys
default-functional-latent-marks
3.2 Implementing Feature Plug-Ins
feature
3.3 Implementing Feature-Specific Analyses
feature-report
print-feature-profile
make-interner

Feature-Specific Profiling

 (require feature-profile) package: feature-profile

This package provides experimental support for profiling the costs of specific language and library features.

Unlike Racket’s regular profiler, which reports time spent in each function, the feature-specific profiler reports time spent in feature instances: a particular pattern matching form, a specific contract, etc. This pinpoints which uses of expensive features are the sources of performance issues and should be replaced by less expensive equivalents. Conversely, the feature-specific profiler also reports which feature instances are not problematic and should be left alone.

Out of the box, the profiler includes plug-ins for the following features:
  • Contracts

  • Output

  • Generic sequence dispatch

  • Typed Racket casts and assertions

  • Pattern matching

  • Method dispatch

  • Optional and keyword arguments

Libraries may supply additional plug-ins for the features they introduce. For example, the Marketplace library includes a process accounting plug-in.

1 Quick Start

The simplest way to use this tool is the raco feature-profile command, which takes a file name as an argument and runs the feature-specific profiler on the file’s main submodule (if it exists), or on the module itself (if there is no main submodule).

Alternatively, wrap the code you wish to profile with the feature-profile form. Then, execute the program with the feature-profile-compile-handler active.

Some features, such as contracts, can be profiled without using the compile handler. To get information about all features, the compile handler is required.

This can be accomplished by running the program using
  racket -l feature-profile -t program.rkt
Note that any compiled files (with extension .zo) should be removed before running.

Here is an example feature profile:

"fizzbuzz.rkt"

#lang racket
 
(require feature-profile)
 
(define (divisible x n)
  (= 0 (modulo x n)))
 
(define (fizzbuzz n)
  (for ([i (range n)])
    (cond [(divisible i 15) (printf "FizzBuzz\n")]
          [(divisible i 5)  (printf "Buzz\n")]
          [(divisible i 3)  (printf "Fizz\n")]
          [else             (printf "~a\n" i)])))
(feature-profile
 (parameterize ([current-output-port (open-output-nowhere)])
   (fizzbuzz 10000000)))

513 samples

Output
account(s) for 54.7% of total running time
4240 / 7752 ms

Cost Breakdown
  3354 ms : fizzbuzz.rkt:13:28
  620 ms : fizzbuzz.rkt:12:28
  134 ms : fizzbuzz.rkt:11:28
  132 ms : fizzbuzz.rkt:10:28

Generic Sequences
account(s) for 14.47% of total running time
1122 / 7752 ms

Cost Breakdown
  1122 ms : fizzbuzz.rkt:7:11

Some of the reports for contracts are produced in separate files. See Contract Profiling.

The first part of the profile reports the time spent performing output, and lists each call to printf separately. The second part reports the time spent doing generic sequence dispatch. The for form on line 5 operates generically on the list returned by range, which introduces dispatch.

2 Using the Profiler

syntax

(feature-profile [#:features features] body ...)

 
  features : (listof feature?)
Reports costs of feature instances and the overall costs of each feature to current-output-port.

For each profiled feature, reports how much time was spent in the feature overall, then breaks down that time by feature instance. Features are sorted in decreasing order of time, and only features for which time was observed are displayed.

The optional features argument contains the list of features that should be observed by the feature profiler. It defaults to default-features.
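For example, restricting profiling to a single feature might look like the following sketch (it assumes the feature-profile package is installed):

```racket
#lang racket
(require feature-profile)

;; Only the Output feature is observed; other features are ignored.
(feature-profile
 #:features (list output-feature)
 (for ([i (in-range 1000)])
   (printf "~a\n" i)))
```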

procedure

(feature-profile-thunk #:features features thunk) → any
  features : (listof feature?)
  thunk : (-> any)
Like feature-profile, but a function that takes a thunk to profile as an argument.
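A sketch of the thunk version (again assuming the package is installed):

```racket
#lang racket
(require feature-profile)

;; Same as wrapping the body in feature-profile, but usable when the
;; code to profile is already packaged as a thunk.
(feature-profile-thunk
 #:features default-features
 (lambda ()
   (for ([i (in-range 1000)])
     (printf "~a\n" i))))
```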

procedure

(feature-profile-compile-handler stx immediate-eval?)
 → compiled-expression?
  stx : any/c
  immediate-eval? : boolean?
Compiles stx for feature profiling. This adds instrumentation around feature code, which introduces a slight overhead (less than 20%).

value

default-features : (listof feature?)

The default set of features observed by the profiler.

2.1 Default features

value

contracts-feature : feature?

value

output-feature : feature?

value

generic-sequence-dispatch-feature : feature?

value

type-casts-feature : feature?

value

keyword-optional-arguments-feature : feature?

value

pattern-matching-feature : feature?

value

send-dispatch-feature : feature?

Individual features provided by the feature profiler. To use all of the features, use default-features.

3 Adding Profiling Support to Libraries

Not all expensive features come from the standard library; some are provided by third-party libraries. For this reason, feature-specific profiling support for third-party library features can be useful.

Adding feature-specific profiling for a library requires implementing a feature-specific plug-in and adding feature marks to the library’s implementation (see Feature Instrumentation).

3.1 Feature Instrumentation

Implementing feature-specific profiling support for a library requires instrumenting its implementation. Specifically, library code must arrange to have feature marks on the stack whenever feature code is running. This is usually a non-intrusive change, and does not change program behavior. Feature marks allow the profiler’s sampling thread to observe when programs are executing feature code.

Conceptually, feature marks are conditionally enabled continuation marks that map feature keys to payloads. Feature keys can be any value, but must uniquely identify features and must be consistent with the feature’s plug-in (see Implementing Feature Plug-Ins).

Payloads can also be any value (except #f), but they should uniquely identify feature instances (e.g. a specific pattern matching form). The source location of feature instances is an example of a good payload. Payloads with additional information can be useful when implementing more sophisticated feature-specific analyses (see Implementing Feature-Specific Analyses).

To avoid attributing time to a feature when feature code transfers control to user code (e.g. when a feature calls a function provided by the user), you can install antimarks, which are feature marks with the 'antimark symbol as payload. Antimarks are recognized specially by the profiling library: any sample taken while an antimark is the most recent feature mark does not contribute time toward that feature.
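As a sketch, instrumenting a hypothetical library operation (all names here are illustrative, not part of any API) can use plain continuation marks, with an antimark around the user callback:

```racket
#lang racket
;; 'my-library-feature is a hypothetical feature key; it must match
;; the key declared by the feature's plug-in.
(define my-feature-key 'my-library-feature)

;; Applies user-fn to each element; the payload identifies this
;; feature instance (here, a vector of srcloc fields).
(define (transform-all lst user-fn payload)
  (with-continuation-mark my-feature-key payload  ; feature mark
    (for/list ([x (in-list lst)])
      ;; User code runs under an antimark, so samples taken inside
      ;; user-fn are not charged to the feature.
      (with-continuation-mark my-feature-key 'antimark
        (user-fn x)))))

(transform-all '(1 2 3) add1 #("demo.rkt" 1 0 1 5))
```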

Feature marks come in three flavors, each with different tradeoffs and appropriate for different use cases.

procedure

(make-latent-mark-compile-handler latent-mark-keys functional-latent-marks)
 → (-> any/c boolean? compiled-expression?)
  latent-mark-keys : (listof any/c)
  functional-latent-marks : dict?
Creates a compile handler similar to feature-profile-compile-handler that supports the provided kinds of latent marks. Compile handlers produced by make-latent-mark-compile-handler turn latent marks with keys they recognize into continuation marks, allowing the corresponding features to be profiled.

The first argument is the list of feature keys used by the latent marks that should be activated. It should usually be an extension of default-syntactic-latent-mark-keys.

The second argument is a dictionary mapping the names of functions whose calls should be profiled to the key of the feature that the function corresponds to. It should usually be an extension of default-functional-latent-marks.
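Installing such a handler could look like this sketch ('my-library-feature is a hypothetical key, and the program path is illustrative):

```racket
#lang racket
(require feature-profile)

;; Recognize the default latent marks plus a hypothetical
;; library-specific key, then load the program under the handler.
(parameterize ([current-compile
                (make-latent-mark-compile-handler
                 (cons 'my-library-feature default-syntactic-latent-mark-keys)
                 default-functional-latent-marks)])
  (dynamic-require "program.rkt" #f))
```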

value

default-syntactic-latent-mark-keys : (listof any/c)

The list of syntactic latent mark keys used by features supported out of the box by the feature-specific profiler.

value

default-functional-latent-marks : dict?

Dictionary mapping feature-related functions profiled by default to the feature keys their marks should use.

3.2 Implementing Feature Plug-Ins

 (require feature-profile/plug-in-lib)
  package: feature-profile

At its core, a plug-in is a feature struct.

struct

(struct feature (name key grouper analysis)
    #:extra-constructor-name make-feature)
  name : string?
  key : any/c
  grouper : (or/c #f (-> any/c any/c))
  analysis : (or/c #f (-> feature-profile? any))
name is a string representing the user-facing name of the feature, and is printed in the profile.

key is the continuation mark key used by the feature’s feature marks. It can be any value, but must be consistent with the key used by the feature’s instrumentation.

grouper is a function that should be used to group mark payloads that correspond to a single feature instance. The function takes a payload as argument, and returns a value that identifies the payload’s equivalence class. That is, all payloads that should be grouped together must return the same value. The grouping function only considers the payload of the most recent mark for a given feature.

A value of #f will result in the plug-in using the default grouping function, which is usually what you want. The default function expects payloads to be vectors holding the fields of a srcloc.
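A custom grouper is just a function on payloads. For instance, this sketch (hypothetical; the default grouper already handles srcloc-vector payloads) merges instances that differ only in column:

```racket
#lang racket
;; Payloads are assumed to be vectors of srcloc fields:
;; #(source line column position span).
(define (group-by-line payload)
  (vector (vector-ref payload 0)    ; source file
          (vector-ref payload 1)))  ; line number

(group-by-line #("fizzbuzz.rkt" 13 28 300 20)) ; ⇒ #("fizzbuzz.rkt" 13)
```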

analysis is a function that performs feature-specific analysis to present profile results in a custom format. It is expected to produce its results as a side-effect. Writing analysis functions is covered in Implementing Feature-Specific Analyses.

A value of #f will result in the plug-in using the default analysis provided by the profiling library. This analysis groups costs by feature instance and prints them to standard output. It is usually best to use this option to start with, and eventually migrate to a more sophisticated analysis.

The most basic feature plug-in is a feature struct with a name and a key, and uses the default grouping and analysis. To include the new feature when profiling, pass the feature struct to feature-profile’s or feature-profile-thunk’s #:extra-features argument.
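Putting this together, a minimal plug-in for a hypothetical library feature is a sketch like:

```racket
#lang racket
(require feature-profile
         feature-profile/plug-in-lib)

;; The key must match the one used by the library's feature marks;
;; #f selects the default grouping and the default analysis.
(define my-library-feature
  (feature "My Library Feature"  ; user-facing name, printed in the profile
           'my-library-feature   ; continuation mark key
           #f                    ; grouper
           #f))                  ; analysis
```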

3.3 Implementing Feature-Specific Analyses

 (require feature-profile/plug-in-lib)
  package: feature-profile

While the basic analysis provided by default by the profiling library (grouping costs by feature instance) is useful, instances of some features (such as contracts) carry enough information to make further analysis worthwhile. Further analysis can be used to produce precise reports tailored specifically to a given feature.

This section describes the API provided by the profiling library for building feature-specific analyses. This API is still experimental, and is likely to change in the future.

A feature-specific analysis is a function that takes a feature-report structure as an argument and emits profiling reports as a side effect. This function should be used as the analysis field of the relevant feature object.

struct

(struct feature-report (feature
    core-samples
    raw-samples
    grouped-samples
    total-time
    feature-time)
    #:extra-constructor-name make-feature-report)
  feature : feature?
  core-samples : (listof (listof any/c))
  raw-samples : any/c
  grouped-samples : (listof (listof (cons/c any/c any/c)))
  total-time : exact-nonnegative-integer?
  feature-time : exact-nonnegative-integer?
Feature report structures contain information about a feature gathered during program execution. Information is available in raw form (what’s directly gathered via sampling) and at various stages of processing by the core profiler. This information can be used by feature-specific analyses to generate custom reports and views.

The feature field contains the feature structure that corresponds to the feature being profiled and analyzed.

The core-samples field contains feature core samples: each sample is a list of all marks related to the feature of interest that were on the stack when the sample was taken. The most recent mark is at the front of the list. These core samples include antimarks.

The raw-samples field contains the samples collected by the regular Racket profiler during program execution, in the (intentionally) undocumented format used by the profiler. These samples can be passed to the regular profiler’s analyzer, and the result correlated with feature data.

The grouped-samples field contains a list of groups (lists) of samples, grouped using the feature’s grouping function. Each sample has the most recent payload as its car and a sample in the regular profiler’s format as its cdr.

The total-time field contains the total time (in milliseconds) observed by the sampler.

The feature-time field contains the time (in milliseconds) for which a feature mark was observed, as estimated by interpolating sample timestamps.

This library also provides helper functions for common analysis tasks.

procedure

(print-feature-profile f-p) → void?
  f-p : feature-report?
Displays the results of the built-in basic analysis. Useful when the custom analysis for a feature supplements (but does not completely replace) basic analysis.
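For example, a custom analysis that prepends a one-line summary to the basic report might look like this sketch (accessor names follow the feature-report struct above):

```racket
#lang racket
(require feature-profile/plug-in-lib)

;; Suitable as the analysis field of a feature struct.
(define (summary-then-basic f-p)
  (printf "feature time: ~a / ~a ms\n"
          (feature-report-feature-time f-p)
          (feature-report-total-time f-p))
  (print-feature-profile f-p))
```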

procedure

(make-interner) → (-> (cons/c any/c any/c) any/c)

Produces an interning function that takes a name-location pair (e.g. from the regular profiler’s samples) and interns it. The regular profiler’s analysis requires name-location pairs to be interned, hence this function.
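A usage sketch: repeated calls with equal? pairs return the same interned pair.

```racket
#lang racket
(require feature-profile/plug-in-lib)

(define intern! (make-interner))
(define a (intern! (cons 'fizzbuzz "fizzbuzz.rkt:7:11")))
(define b (intern! (cons 'fizzbuzz "fizzbuzz.rkt:7:11")))
;; a and b are eq?, as the regular profiler's analyzer requires.
(eq? a b)
```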