
Insight
How to really use web analytics

How business workflows and product roadmaps are becoming more data-driven

Problem

Industry headwinds such as COVID, inflation, constrained fee pools, and compliance and regulatory pressures have raised both the cost of change and the cost of servicing clients, making it harder to identify revenue-generating opportunities.

At the same time, services have continued to undergo digital transformation, with an increased focus on cost reduction and return on investment.

Traditional reporting systems such as Power BI, Tableau and Wave Analytics often fail to integrate into everyday business decision-making, and often do not capture information about changing client and product requirements.

Unguided decisions in resource allocation and product strategy have led some firms to make costly mistakes and fall behind the competition.

Solution

Smart Web Apps steer business workflows towards strategic outcomes by using comprehensive web analytics.

Example Data-driven Web App workflows:

  • Clients receive personalised trade recommendations based on pre-trade engagement, instrument discovery, trading activity, margin, entitlements and risk appetite.

  • Cross-sell opportunities are flagged and recommended in real time as client behaviours change, together with suggested entitlement changes.

  • Sales interventions are triggered under specific conditions to give a product or service a competitive edge while still meeting KPIs.

  • Workflows direct account managers towards specific clients, e.g. high-net-worth clients with high pre-trade engagement but low trade frequency.

  • Authoring tools prompt research analysts to cover trends identified across client searches, watchlists, portfolios and instrument discovery.

  • Content management tools suggest which marketing campaigns to run, based on uptake and prior performance.

Furthermore, product managers can use the same data to steer product roadmaps.

Prioritising the product roadmap from MIS data:

  • Product managers can configure workflow improvements and proposed strategy changes to run in parallel in small controlled trials. Clients may also choose to opt into beta features.

  • All workflows are extensively monitored and measured against a range of outcomes and KPIs. Trial results (e.g. client revenue impact) are reported, along with projections for large-scale adoption.

How is this implemented?

Instrumentation

Smart Web Apps are finely instrumented to capture user interactions: each workflow step is measured alongside the context, or 'state', of the interaction. Non-functional information (e.g. time on screen, message latency, device information and other performance data) is also captured.

Smart Web Apps use a systematic middleware approach to capture actions and the corresponding state, requiring little or no developer effort. A consistent data schema across apps reduces data normalisation effort and makes queries more efficient.
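By way of illustration, the sketch below shows one shape such middleware could take: a decorator that wraps each workflow handler and emits an event containing the action, its state and basic performance data. The schema, names and handler here are assumptions for illustration rather than the actual implementation.

```python
import json
import time
import uuid
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import Any, Callable, Dict

@dataclass
class InstrumentationEvent:
    """Illustrative schema shared by all apps so downstream queries stay uniform."""
    event_id: str
    source: str             # e.g. "web_app", "api", "email"
    action: str             # workflow step, e.g. "view_trade_idea"
    state: Dict[str, Any]   # interaction context, e.g. client and instrument
    duration_ms: float      # non-functional data: handler latency
    timestamp: str

def emit(event: InstrumentationEvent) -> None:
    # In production this would publish to a message bus; here we just print.
    print(json.dumps(asdict(event)))

def instrumented(action: str, source: str = "web_app") -> Callable:
    """Middleware decorator: captures action + state with no per-handler code."""
    def wrap(handler: Callable) -> Callable:
        def inner(state: Dict[str, Any], **kwargs: Any) -> Any:
            start = time.perf_counter()
            result = handler(state, **kwargs)
            emit(InstrumentationEvent(
                event_id=str(uuid.uuid4()),
                source=source,
                action=action,
                state=state,
                duration_ms=(time.perf_counter() - start) * 1000,
                timestamp=datetime.now(timezone.utc).isoformat(),
            ))
            return result
        return inner
    return wrap

@instrumented("view_trade_idea")
def view_trade_idea(state: Dict[str, Any]) -> str:
    return f"Rendering trade idea {state['idea_id']} for {state['client_id']}"

print(view_trade_idea({"client_id": "C123", "idea_id": "TI-42"}))
```

Because every handler is wrapped the same way, all apps emit events of the same shape, which is what keeps downstream aggregation and querying simple.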

Aggregation and Persistence

Instrumentation from all channels is aggregated and persisted; events are labelled by source: web app, messaging system, API endpoint, email service or third party (e.g. Bloomberg, ECNs).
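A minimal sketch of this aggregation step, assuming a single event table keyed by source; SQLite stands in for whichever store is actually used, and the identifiers are illustrative.

```python
import json
import sqlite3

# Illustrative event store: one table, every event labelled with its source channel.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        event_id TEXT PRIMARY KEY,
        source   TEXT NOT NULL,   -- 'web_app', 'messaging', 'api', 'email', 'third_party'
        action   TEXT NOT NULL,
        payload  TEXT NOT NULL,   -- JSON blob: state plus non-functional data
        ts       TEXT NOT NULL
    )
""")

def persist(event: dict, source: str) -> None:
    """Aggregate an event from any channel into the shared store."""
    conn.execute(
        "INSERT INTO events VALUES (?, ?, ?, ?, ?)",
        (event["event_id"], source, event["action"],
         json.dumps(event.get("state", {})), event["timestamp"]),
    )

persist({"event_id": "e1", "action": "view_trade_idea",
         "state": {"client_id": "C123"}, "timestamp": "2024-01-05T09:00:00Z"},
        source="web_app")
persist({"event_id": "e2", "action": "price_request",
         "state": {"client_id": "C123"}, "timestamp": "2024-01-05T09:01:00Z"},
        source="api")

# Events from different channels sit side by side and can be queried uniformly.
for row in conn.execute("SELECT ts, source, action FROM events ORDER BY ts"):
    print(row)
```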

Analytics Engine

This is the 'smart' part. The purpose of the analytics engine is to provide the business and its clients with real-time and strategic decision support through quantitative analysis of instrumented data.

The first step is to track key KPIs. These tend to take the form "increase revenue by x%", "increase flow by y%", or "increase engagement x for product y by z%".

Next, we track positive and negative outcomes, e.g. client revenue, trade frequency, accepted/rejected trades, and positive responses to sales, marketing and research. Outcomes can be scalar and may be negative, e.g. profit or loss.
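As a purely illustrative way of representing these first two steps, KPI targets and outcomes can be modelled as simple records; the names and figures below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class KpiTarget:
    """A KPI of the form 'increase <metric> by <target_pct>%'."""
    name: str
    metric: str
    target_pct: float

@dataclass
class Outcome:
    """A measured outcome; scalar values may be negative (e.g. a loss)."""
    client_id: str
    kind: str      # 'revenue', 'trade_accepted', 'sales_response', ...
    value: float

kpis = [
    KpiTarget("Grow structured product revenue", metric="revenue", target_pct=5.0),
    KpiTarget("Increase product engagement", metric="engagement", target_pct=10.0),
]

outcomes = [
    Outcome("C123", "revenue", 4_200.0),
    Outcome("C456", "revenue", -750.0),      # negative outcome: a loss
    Outcome("C123", "trade_accepted", 1.0),  # binary outcomes recorded as 0/1
]

net_revenue = sum(o.value for o in outcomes if o.kind == "revenue")
print(f"Net client revenue tracked so far: {net_revenue:,.0f}")
```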

Finally, we create metrics describing how instrumented events (or combinations of events) relate to each outcome. These metrics behave like correlation coefficients between event payload fields (e.g. time on screen for a trade idea) and outcomes, and can be monitored dynamically. This allows the analytics engine to identify which instrumented events are contributing towards each outcome.
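The sketch below illustrates the idea with a plain Pearson correlation between one payload field and one outcome; the per-client figures are invented, and a real engine would compute such metrics across many event/outcome pairs.

```python
from math import sqrt

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation between an event payload field and an outcome."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative per-client aggregates drawn from the event store:
# feature = average time on screen for trade ideas (seconds),
# outcome = client revenue over the same period.
time_on_screen = [12.0, 45.0, 30.0, 5.0, 60.0, 25.0]
client_revenue = [1_000.0, 5_200.0, 3_100.0, 400.0, 7_000.0, 2_600.0]

r = pearson(time_on_screen, client_revenue)
print(f"time_on_screen vs revenue: r = {r:.2f}")
# Metrics like this, computed for every event/outcome pair, let the engine rank
# which instrumented events appear to contribute most to each outcome.
```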

Hypothesis testing

Prior to full-scale adoption, Smart Web Apps facilitate A/B testing to establish a causal relationship and measure the realised gain. Here, two or more versions of a workflow operate in parallel for different user segments. The scope of these changes can be small, e.g. tweaking the information that appears in term sheets, or large, e.g. a new instrument discovery workflow.
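A minimal sketch of how such a trial might be evaluated, assuming a conversion-style outcome and a standard two-proportion z-test; the counts are illustrative only.

```python
from math import erf, sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """One-sided two-proportion z-test: does variant B convert better than A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))   # P(Z > z) under the null
    return p_b - p_a, z, p_value

# Illustrative trial: A = existing discovery workflow, B = proposed new workflow.
uplift, z, p = two_proportion_z(conv_a=120, n_a=2_000, conv_b=165, n_b=2_050)
print(f"uplift = {uplift:.2%}, z = {z:.2f}, p = {p:.3f}")
# A small p-value suggests the improvement is unlikely to be chance; the measured
# uplift is the realised gain to carry into projections.
```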

Another approach, borrowed from marketing, involves time-limited "campaigns" of experimental workflows, whose effectiveness is then quantified and projected.

From the test results, one can extrapolate the quantitative impact that changes in instrumented events may have on overall KPIs, e.g. revenue. This helps the business prioritise specific workflow improvements in the product backlog.
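A simple illustration of that extrapolation: scale each trial's measured uplift by the number of clients the change would reach, then rank backlog items by projected impact. The candidate changes and figures are assumptions.

```python
# Rank candidate workflow changes by projected contribution to the revenue KPI:
# measured uplift per client (from the trial) scaled by the clients it would reach.
candidates = [
    # (workflow change, trial uplift in revenue per client, clients affected)
    ("Improved term sheets",        12.0,  5_000),
    ("New discovery workflow",      35.0,  1_200),
    ("Real-time cross-sell flags",   8.0, 10_000),
]

projected = sorted(
    ((name, uplift * clients) for name, uplift, clients in candidates),
    key=lambda item: item[1],
    reverse=True,
)

for name, impact in projected:
    print(f"{name}: projected revenue impact = {impact:,.0f}")
```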

Reporting

While decision support is typically built directly into Web Apps, traditional reporting functions are sometimes needed. Recently we fed analytics reports into the llama2 LLM to answer text analytics questions, e.g. "which product types are generating the most revenue?", "which industry sectors is client x looking at?", or "what was the most successful trade idea last month?". LLMs currently work best at picking out information from existing reports, albeit with some unexpected or erroneous responses.
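A rough sketch of this prompt-over-report pattern is shown below; the ask_llm function is a placeholder for the actual llama2 deployment, and the report excerpt and answer are invented for illustration.

```python
def ask_llm(prompt: str) -> str:
    """Stub standing in for a call to the llama2 deployment (e.g. a local
    inference endpoint); returns a canned answer so the sketch runs."""
    return "Structured products generated the most revenue last month."

# Analytics report text produced by the reporting layer (invented excerpt).
report = """Revenue by product type, last month:
  Structured products: 4.2m
  FX forwards:         2.9m
  Corporate bonds:     1.7m"""

question = "Which product types are generating the most revenue?"

prompt = (
    "Answer the question using only the report below.\n\n"
    f"Report:\n{report}\n\n"
    f"Question: {question}\n"
)

print(ask_llm(prompt))
```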

We also explored LLM generation of analytics queries, to be run against the analytics engine. However, hallucinations and the low tolerance for error make this approach challenging.

Conclusions

As the cost of change increases, more decisions are becoming data-driven. Web Apps now provide data-driven suggestions to help allocate business resources, prioritise the product roadmap and maximise the customer experience.