MSEdgeExplainers

Web Ink Enhancement: Delegated Ink Trail Presentation Aided By The OS

Author: Daniel Libby

Status of this Document

This document is intended as a starting point for engaging the community and standards bodies in developing collaborative solutions fit for standardization. As the solutions to problems described in this document progress along the standards-track, we will retain this document as an archive and use this section to keep the community up-to-date with the most current standards venue and content location of future work and discussions.

Introduction

Achieving low latency is critical for delivering great inking experiences on the Web. Ink on the Web is generally produced by consuming PointerEvents and rendering strokes to the application view, whether that be 2D or WebGL canvas, or less commonly, SVG or even HTML.
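A minimal sketch of this baseline model, assuming a 2D canvas context. The `pointsToRender` helper name is illustrative, not a platform API; it flattens a pointer event into all of the points to render, including coalesced events that arrived since the last dispatched event:

```javascript
// Illustrative helper (not a platform API): flatten a pointermove event into
// the full list of points to render. Per the Pointer Events spec,
// getCoalescedEvents() returns the coalesced points with the dispatched
// event's point as the last entry; the list is empty for non-pointermove
// events, so fall back to the event itself in that case.
function pointsToRender(evt) {
    const coalesced = typeof evt.getCoalescedEvents === "function"
        ? evt.getCoalescedEvents()
        : [];
    const events = coalesced.length > 0 ? coalesced : [evt];
    return events.map(e => ({ x: e.x, y: e.y, pressure: e.pressure }));
}

// Typical consumption pattern on a 2D canvas (sketch):
// canvas.addEventListener("pointermove", evt => {
//     for (const p of pointsToRender(evt)) {
//         drawStrokeSegment(ctx, p);  // application-specific rendering
//     }
// });
```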

There are a number of progressive enhancements to this programming model that are aimed at reducing latency.

Problem

Desynchronized canvas (the `desynchronized` canvas context attribute) is subject to hardware limitations and may not consistently deliver the latency improvements that applications depend on for great inking experiences.

There are typically two types of representation of an ink stroke: 'wet ink', rendered while the pen is in contact with the screen, and 'dry ink', which is rendered once the pen is lifted. For applications such as annotating documents, wet ink is generally rendered segment-by-segment via canvas, but it is desirable for dry ink to become part of the document's view.

Desynchronized canvas is inherently unable to synchronize with other HTML content, which makes drying ink difficult or impossible to implement without visual artifacts. When the pen is lifted, the application stops drawing the stroke on the canvas and 'dries' the stroke into the document view (e.g. as SVG in HTML). With a desynchronized canvas in this scenario, there is no guarantee that the dried content shows up in the same frame in which the wet stroke is erased. This may produce one frame with no ink content, or one frame where both wet and dry ink are visible, which appears as a flash of darker color for non-opaque strokes.
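The wet-to-dry handoff can be sketched as follows. This is an illustrative model only, not part of any proposed API: points accumulate in a wet buffer while the pen is down, and on pointerup they are converted into SVG path data for insertion into the document view.

```javascript
// Illustrative sketch of the wet/dry model (all names are hypothetical).
class StrokeRecorder {
    constructor() {
        this.wetPoints = [];
    }

    // Called for each point while the pen is down; the application would
    // also draw the segment to the (possibly desynchronized) canvas here.
    addWetPoint(x, y) {
        this.wetPoints.push({ x, y });
    }

    // Called on pointerup: convert the wet stroke into SVG path data for
    // the document view and erase the wet buffer. With a desynchronized
    // canvas there is no guarantee the SVG appears in the same frame in
    // which the canvas stroke is erased, producing the artifact described
    // above.
    dryToSvgPathData() {
        const [first, ...rest] = this.wetPoints;
        this.wetPoints = [];
        return `M ${first.x} ${first.y}` +
            rest.map(p => ` L ${p.x} ${p.y}`).join("");
    }
}
```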

Solution

Operating system compositors typically introduce a frame of latency in order to compose all of the windows together. During this frame of latency, input may be delivered to an application, but that input has no chance of being displayed to the user until the next frame that the system composes, due to this pipelining. System compositors may have the capability to provide an earlier rendering of this input on behalf of the application. We propose exposing this functionality to the Web so that web applications can achieve latency parity with native applications on supported systems. This would also be a progressive enhancement in the same vein as others covered previously.

In order for the system to draw the subsequent points with enough fidelity that the user does not notice the difference, the application needs to describe the last rendered point in sufficient detail. If the system knows the last rendered point, it can produce the segments of the ink trail for input that has been delivered but not yet rendered (or at least has not reached the end of the rendering pipeline).
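Conceptually, the system derives the delegated trail from the input it has already delivered and the application's last rendered point. The sketch below is purely illustrative of that idea; none of these names are part of the proposed API:

```javascript
// Given the points the system has already delivered to the application and
// the last point the application reports having rendered, the trailing
// points form the ink trail the system draws on the application's behalf.
function delegatedTrail(deliveredPoints, lastRenderedPoint) {
    const i = deliveredPoints.findIndex(
        p => p.x === lastRenderedPoint.x && p.y === lastRenderedPoint.y);
    // If the last rendered point is not among the delivered points, the
    // application has not caught up at all, so the system draws the whole
    // delivered stroke.
    return i >= 0 ? deliveredPoints.slice(i + 1) : deliveredPoints;
}
```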

Sample app flow

Application canvas with some application-rendered ink, plus continued user input that has not yet been delivered to the app

The app renders ink from the pointer events it has received, while the user continues the stroke and the OS is still delivering input to the app.

Application canvas with some application-rendered ink and OS-rendered ink

The OS renders the ink trail for incoming user input, using the last rendered point and the stroke style set by the app, while delivering that input to the app in parallel.

Application canvas with application-rendered ink that has replaced the OS-rendered ink

As the pointer events are delivered to the app, the application continues rendering ink, seamlessly replacing the OS-drawn trail with application-rendered strokes.

Goals

Non-goals

Code example

const canvas = document.querySelector("canvas");
const renderer = new InkRenderer();
const minExpectedImprovement = 8;

try {
    let presenter = await navigator.ink.requestPresenter('delegated-ink-trail', canvas);
    
    // With pointerrawupdate events and JavaScript prediction, we can reduce
    // latency by 16+ ms, so fall back if the InkPresenter cannot provide
    // enough benefit.
    if (presenter.expectedImprovement < minExpectedImprovement)
        throw new Error("Little to no expected improvement, falling back");

    renderer.setPresenter(presenter);
    window.addEventListener("pointermove", evt => {
        renderer.renderInkPoint(evt);
    });
} catch(e) {
    // Ink presenter not available; use desynchronized canvas, prediction,
    // and pointerrawupdate instead.
    renderer.usePrediction = true;
    renderer.desynchronized = true;
    window.addEventListener("pointerrawupdate", evt => {
        renderer.renderInkPoint(evt);
    });
}

class InkRenderer {
    constructor() {}

    renderInkPoint(evt) {
        // getCoalescedEvents() already includes the dispatched event's point
        // as its last entry; fall back to the event itself when the list is
        // empty (e.g. for non-pointermove events).
        const events = evt.getCoalescedEvents();
        (events.length > 0 ? events : [evt]).forEach(event => {
            this.renderStrokeSegment(event.x, event.y);
        });

        if (this.presenter) {
            this.presenterStyle = { color: "rgba(0, 0, 255, 0.5)", radius: 4 * evt.pressure };
            this.presenter.setLastRenderedPoint(evt, this.presenterStyle);
        }
    }

    setPresenter(presenter) {
        this.presenter = presenter;
    }

    renderStrokeSegment(x, y) {
        // application specific code to draw
        // the stroke on 2d canvas for example
    }
}

Proposed WebIDL

partial interface Navigator {
    [SameObject] readonly attribute Ink ink;
};

interface Ink {
    Promise<InkPresenter> requestPresenter(DOMString type, optional Element presentationArea);
};

dictionary InkTrailStyle {
    DOMString color;
    double radius;
};

interface InkPresenter {
};

interface DelegatedInkTrailPresenter : InkPresenter {
    void setLastRenderedPoint(PointerEvent evt, InkTrailStyle style);

    attribute Element presentationArea;
    readonly attribute unsigned long expectedImprovement;
};

Interface Details

Due to uncertainty about the correct behavior when setLastRenderedPoint is called before a stroke style has been set, and because the radius is likely to change frequently (e.g. with pressure), we decided it is best to require all properties relevant to rendering the ink stroke in every call to setLastRenderedPoint.

Instead of providing setLastRenderedPoint with a PointerEvent, just providing x and y values is also an option. A trusted PointerEvent was judged the better choice, however, as it gives easy access to the pointer ID and spares the site author from reasoning about the exact position of the ink.

Providing the presenter with a presentation area element, such as the canvas, allows the boundaries of the drawing area to be determined. This is necessary so that forwarded points and ink outside the desired area are not drawn. If no element is provided, the containing viewport is used.

The expectedImprovement attribute gives site authors an estimate of the benefit of using this API: the expected average number of milliseconds by which perceived latency will be reduced, including the effect of prediction.

Other options

We considered a few alternative locations for the setLastRenderedPoint method, but each had drawbacks:

