Pattern · JavaScript · Tip
performance.mark and performance.measure for custom timing instrumentation
Tags: performance.mark, performance.measure, User Timing API, PerformanceObserver, RUM, profiling, DOMHighResTimeStamp
Problem
console.time() and Date.now() provide rough timings, but they do not integrate with the browser's performance timeline, cannot easily be exported to RUM analytics tools, and Date.now() is limited to millisecond resolution.
Solution
Use the User Timing API (performance.mark and performance.measure) to create named markers that appear in DevTools and can be collected with PerformanceObserver.
// Mark the start of an operation
performance.mark('data-fetch-start');
await fetchData();

// Mark the end
performance.mark('data-fetch-end');

// Create a named measure between two marks
performance.measure(
  'data-fetch-duration',
  'data-fetch-start',
  'data-fetch-end'
);

// Read measurements
const [measure] = performance.getEntriesByName('data-fetch-duration');
console.log('Fetch took:', measure.duration, 'ms');

// Collect via PerformanceObserver for RUM
new PerformanceObserver((list) => {
  list.getEntries().forEach(entry => sendToAnalytics(entry));
}).observe({ entryTypes: ['measure'] });
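sendToAnalytics above is left undefined; a minimal sketch, assuming a hypothetical /analytics endpoint, could serialize each entry and queue it with navigator.sendBeacon:

// Hypothetical RUM exporter; the endpoint and payload shape are assumptions
function sendToAnalytics(entry) {
  const payload = JSON.stringify(entry.toJSON()); // name, entryType, startTime, duration
  // sendBeacon queues the request without blocking the page, even during unload
  if (!navigator.sendBeacon('/analytics', payload)) {
    // Fall back to fetch if the beacon could not be queued
    fetch('/analytics', { method: 'POST', body: payload, keepalive: true });
  }
}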
Why
performance.mark records DOMHighResTimeStamp values, giving sub-millisecond resolution (browsers coarsen the clock slightly for security, but it remains far finer than Date.now()). Marks and measures appear in the DevTools Performance timeline as labeled regions, making it easy to correlate code execution with rendering events, and many Real User Monitoring tools collect User Timing entries directly.
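To attach metadata that a RUM pipeline can use, newer browsers also accept an options object for performance.measure (User Timing Level 3). A minimal sketch; the detail fields shown are made-up examples:

// Options-object form: reference the same marks, plus arbitrary structured metadata
performance.measure('data-fetch-duration', {
  start: 'data-fetch-start',
  end: 'data-fetch-end',
  detail: { endpoint: '/api/data', cacheHit: false }  // hypothetical fields
});
// entry.detail is then available on the resulting measure,
// e.g. inside a PerformanceObserver callback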
Gotchas
- Marks persist for the lifetime of the page — call performance.clearMarks() and performance.clearMeasures() to avoid accumulation
- Mark names do not need to be unique: reusing a name creates multiple entries, and performance.measure resolves a mark name to the most recent entry with that name
- A PerformanceObserver only receives entries created after observation starts, unless you pass buffered: true, which only works with the single-type form observe({ type: 'measure', buffered: true }); see the sketch after this list
- performance.now() is the low-level building block if you only need the timestamp without a named entry
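A sketch combining those points: a buffered observer that reports measures (via the sendToAnalytics placeholder above) and clears entries once they have been sent:

const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    sendToAnalytics(entry);
  }
  // Drop reported entries so the buffer does not grow for the page's lifetime
  performance.clearMeasures();
  performance.clearMarks();
});
// buffered: true replays entries recorded before observe() was called;
// it is only honored with the single `type` form, not with `entryTypes`
observer.observe({ type: 'measure', buffered: true });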
Code Snippets
Generic timed() wrapper using User Timing API
// Wrap any async operation with timing
async function timed(name, fn) {
  performance.mark(`${name}-start`);
  try {
    return await fn();
  } finally {
    performance.mark(`${name}-end`);
    performance.measure(name, `${name}-start`, `${name}-end`);
    performance.clearMarks(`${name}-start`);
    performance.clearMarks(`${name}-end`);
  }
}
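Usage, with a hypothetical loadDashboard() as the wrapped operation; the measure is named 'load-dashboard', so it appears under that label in the Performance timeline and in getEntriesByName():

// The returned promise resolves with fn's result, so timed() is a drop-in wrapper
const data = await timed('load-dashboard', () => loadDashboard());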
Context
When adding performance instrumentation to production code for RUM or DevTools analysis