Single-page applications need better auditing


Most web pentesting tools currently focus on backend exploitation (such as SQL injection, reflected or stored XSS, …). However, in recent years, the frontend parts of applications have grown so important that meaningful security issues can be introduced without any backend exploit at all. The high complexity of frontend applications has also made auditing them more complicated and extremely time-consuming.

This article describes the difficulties that are present when auditing single-page applications (SPAs) and how a tool could help overcome them.

A PoC for such a tool, called SPAudit, is introduced in the last part of the article.

Introduction: websites have changed

Since the age of the first dynamically rendered web pages in PHP, websites have been a backend matter. JavaScript was only there to animate menus or perform the occasional AJAX call. Security tools were designed around these assumptions, and conducting a pentest without such automated tools would be a counter-productive exercise: modern tooling lets security auditors perform advanced and repetitive operations with ease, greatly improving their efficiency.

However, most of these tools focus on finding and exploiting flaws in application backends (SQL injections, reflected and stored XSS, remote code execution, …), and little effort has gone into finding security issues in the frontend part of websites. This made sense in the old days of websites as static pages, but websites are a much different beast these days.

Auditing websites, then and now

Historically, website frontends were pretty limited in terms of security interest, which meant auditing techniques were fairly basic:


Before the rise of single-page applications, crawling a website was only about performing HTTP requests to follow links and fill in forms. The frontend part of the application did not store any state (apart from the usual cookie), and any bot fluent in HTTP with a basic grasp of HTML could crawl and copy any website.

Modern applications store a complex state locally; accessing a view through an HTTP call is not the same as accessing it through a page mutation. Moreover, most of these applications would be worthless to crawl without a JavaScript environment, as the underlying HTML page might contain nothing but a single div tag.
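The difference can be sketched with a toy client-side router (all names below are hypothetical): the only HTML the server ever sends is an empty shell, and every view exists purely as JavaScript, so an HTTP-only crawler never sees most of the application.

```javascript
// What an HTTP crawler sees: the server only ever returns this shell.
const serverResponse = '<html><body><div id="app"></div></body></html>';

// Hypothetical client-side router: views exist only as JavaScript, and
// navigation mutates the page instead of issuing a new request.
const routes = {
  '/': () => '<h1>Home</h1>',
  '/account': () => '<h1>Account</h1><form id="settings"></form>',
};

function render(path) {
  // Unknown routes fall back to the home view (an assumption of this sketch).
  const view = routes[path] || routes['/'];
  return view();
}

// A crawler without a JavaScript engine never discovers the form below:
console.log(render('/account')); // <h1>Account</h1><form id="settings"></form>
```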

Observing Mutations

Old-fashioned websites hardly had any mutations. Most were either cosmetic (a small script to show/hide a component) or very simple (an event listener attached to an element). When the event listener fired, it executed a simple JavaScript function that might perform an AJAX call and then update part of the view with the latest data.

Single-page applications, and their more advanced form, progressive web applications, are much more complex:

  • Most HTML elements' default behaviors are overridden, for instance:
    • Forms do not directly trigger HTTP POST requests
    • Links do not have an href property and trigger page mutations through an event listener
    • Any element can be turned into a text input through the contenteditable attribute
  • Applications hold complex state locally including cached data (in a custom cache or through a browser-provided store or database)
  • AJAX calls can be hooked and modified by hidden application layers such as service workers
  • Part of the application logic can happen in a worker
  • Third-party scripts (such as analytics tools or browser extensions) can inject and override any expected behavior of the application

Auditing modern web applications

The growing complexity introduced by the transition from websites to web applications has created new threats against anything that runs in a browser. These threats include, but are not limited to:

  • DOM-based XSS (including payloads stored in the browser)
  • Improper cache cleanup (e.g., another user's data remaining available after they closed their session)
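As an illustration of the first threat, here is the classic DOM-based XSS shape, with a simulated source and sink (a real page would read location.hash and write to innerHTML; the function names are hypothetical):

```javascript
// A typical DOM-based XSS pattern: attacker-controlled input flows from a
// source (here, a simulated location.hash) into an HTML sink without encoding.
function renderGreeting(hash) {
  const name = decodeURIComponent(hash.slice(1)); // e.g. "#<img src=x onerror=...>"
  return '<p>Hello ' + name + '</p>'; // vulnerable: this string lands in innerHTML
}

// Minimal fix: HTML-encode user input before it hits the sink.
function escapeHtml(s) {
  return s.replace(/[&<>"']/g, c => ({
    '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;', "'": '&#39;',
  }[c]));
}

function renderGreetingSafe(hash) {
  return '<p>Hello ' + escapeHtml(decodeURIComponent(hash.slice(1))) + '</p>';
}
```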

Also, the growing use of libraries and frameworks has made black box code audits of frontend applications much more complex:

  • Huge volumes of dead code are shipped in applications
  • Most of an application's volume is actually third-party code
  • Pseudo-module systems (as introduced by webpack or browserify) make runtime audits of the application harder to perform

What should a modern web application auditing tool look like?

A tool designed to pentest single-page applications should offer several features to be of any use:

Browser integration

Auditing a JavaScript-heavy (or perhaps even WebAssembly-heavy) application outside of a web browser would make no sense, as most of the application's behaviors cannot be triggered outside such an environment.

Ecosystem knowledge

Knowing how Redux stores the state of an application, how Vue.js renders components, or how Webpack bundles applications is the only way to reduce noise when observing the application in action.
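One way such ecosystem knowledge could be encoded is a fingerprint table of globals that frameworks and bundlers commonly leave behind. The names below are rough assumptions that vary between versions, so this is a heuristic sketch, not part of SPAudit:

```javascript
// Hypothetical fingerprinting heuristic: map tools to globals they commonly
// expose. Treat matches as hints, not proof -- names change across versions.
const FINGERPRINTS = {
  'webpack': ['webpackJsonp', 'webpackChunk'],
  'Vue.js': ['__VUE__', 'Vue'],
  'Redux': ['__REDUX_DEVTOOLS_EXTENSION__'],
};

function detectEcosystem(globalNames) {
  const names = new Set(globalNames);
  return Object.keys(FINGERPRINTS)
    .filter(tool => FINGERPRINTS[tool].some(g => names.has(g)));
}

// In a real page one would pass Object.keys(window); here we simulate it:
console.log(detectEcosystem(['webpackChunk', 'Vue'])); // [ 'webpack', 'Vue.js' ]
```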

Discoverability helpers

By definition, a pentester is not the person who built the application. Discovering event listeners and unsafe JavaScript calls (such as eval or assignments to innerHTML) should be straightforward and not require a deep dive into the DOM.
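A first, deliberately crude approximation of such a helper: scanning markup for inline handlers and contenteditable elements. Listeners attached through addEventListener are invisible to this pass, which is exactly why a real tool needs deeper browser integration, but the sketch shows the kind of output an auditor wants (function name and regex are made up for illustration):

```javascript
// Crude discoverability helper: flag tags carrying inline event handlers
// (onclick, onsubmit, ...) or the contenteditable attribute.
function findInteractiveHints(html) {
  const hints = [];
  const re = /<(\w+)[^>]*\s(on\w+|contenteditable)\s*=?[^>]*>/gi;
  let m;
  while ((m = re.exec(html)) !== null) {
    hints.push({ tag: m[1].toLowerCase(), attribute: m[2].toLowerCase() });
  }
  return hints;
}

const sample = '<div contenteditable>note</div><a onclick="go()">next</a>';
console.log(findInteractiveHints(sample));
// [ { tag: 'div', attribute: 'contenteditable' }, { tag: 'a', attribute: 'onclick' } ]
```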

Moderate intrusiveness

Injecting scripts into a web application could have unexpected outcomes: the auditor could no longer be certain whether a given behavior is naturally present or was introduced by the audit tooling. DOM and JavaScript pollution should be avoided at all costs.

Mix of dynamic and static analysis

Static analysis is great for finding specific elements of an application (such as eval calls), but discovering how such a call can be triggered and exploited may be close to impossible through code review alone in a very complex application. Dynamic analysis (based, for instance, on code-coverage tools) can quickly identify code paths in an application and highlight the tweaks needed to reach a desired call.
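Chrome exposes real code coverage through the DevTools Protocol (the Profiler domain's precise-coverage commands), but the essence of the approach can be shown with a much smaller sketch that wraps functions and counts invocations (all names below are hypothetical):

```javascript
// Coverage sketch: wrap each function so every invocation is counted, then
// trigger an action and inspect which code paths actually ran.
function instrument(obj, coverage) {
  for (const name of Object.keys(obj)) {
    const original = obj[name];
    if (typeof original !== 'function') continue;
    obj[name] = (...args) => {
      coverage[name] = (coverage[name] || 0) + 1;
      return original(...args);
    };
  }
  return obj;
}

// Hypothetical application module under audit:
const coverageData = {};
const app = instrument({
  sanitize: s => s.replace(/</g, '&lt;'),
  render: s => '<p>' + s + '</p>',
}, coverageData);

// Simulate the event handler the auditor triggers in the page:
app.render(app.sanitize('<b>hi</b>'));
console.log(coverageData); // { sanitize: 1, render: 1 }
```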

Noise reduction for dynamic analysis

When performing dynamic analysis of a specific part of an application, a pentester wants to avoid signals coming from other scripts on the page. For instance, when using code-coverage tools to evaluate the code path triggered by a click on a button, one wants to ignore the fact that the page's ad-refresh logic is fetching new content.
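One simple way to cut that noise is to filter coverage (or network) records by script origin before analyzing them. The record shape below is made up for illustration; real coverage data is richer:

```javascript
// Keep only records from first-party scripts so third-party noise
// (ads, analytics) does not pollute the analysis.
function filterFirstParty(records, origin) {
  return records.filter(r => r.url.startsWith(origin));
}

const records = [
  { url: 'https://app.example/main.js', functions: 42 },
  { url: 'https://ads.tracker.net/refresh.js', functions: 7 },
  { url: 'https://app.example/vendor.js', functions: 130 },
];

console.log(filterFirstParty(records, 'https://app.example'));
// keeps main.js and vendor.js, drops the ad script
```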

Introducing SPAudit

SPAudit is a Chrome extension that extends Chrome developer tools with pentesting features. It is designed to bring auditing capabilities to the frontend of modern web apps. This means an auditor can use SPAudit to find out which parts of the SPA can be interacted with and which chunks of code run when an event is triggered in the page.

This tool reduces the time spent auditing an application by giving a quick understanding of critical parts of the SPA.

Current state

The extension does not inject any scripts into the audited web page and does not modify its DOM. It relies heavily on the Chrome DevTools Protocol to inspect and interact with the page in the least intrusive way possible.

The tool currently features:

  • A search tool to identify DOM elements based on their tags and/or their event listeners
  • A static analysis tool to detect eval calls and assignments to innerHTML properties
  • The underlying technology layer needed to fully use the Chrome DevTools Protocol and perform static and dynamic analysis over instantiated scripts
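To give an idea of what the static pass looks for, here is a deliberately naive stand-in, not SPAudit's actual implementation: a production pass would parse the AST rather than use regexes, which miss aliased or computed sinks.

```javascript
// Naive static pass: flag eval calls and innerHTML assignments in a
// script's source, reporting the name and source offset of each finding.
function findDangerousSinks(source) {
  const findings = [];
  const patterns = [
    { name: 'eval call', re: /\beval\s*\(/g },
    { name: 'innerHTML assignment', re: /\.innerHTML\s*=/g },
  ];
  for (const { name, re } of patterns) {
    let m;
    while ((m = re.exec(source)) !== null) {
      findings.push({ name, index: m.index });
    }
  }
  return findings;
}

const script = 'el.innerHTML = data; eval(payload);';
console.log(findDangerousSinks(script)); // reports one finding per sink
```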

Next steps

SPAudit is still missing some features; in the near term, dynamic analysis helpers will be added to the tool. Much more work will be needed to reach the point where SPAudit is ecosystem-aware.


What would you expect from a single-page application audit tool? Please let me know, as I am still deciding which features to add.

I will continue developing SPAudit at least until it reaches the point where I can use it daily to quickly find issues in the websites I use myself. I also probably need to use a real web framework such as Vue.js for rendering the extension. Maybe I will end up testing the tool with the tool itself!