Case study

Hounslow and Chesterfield cut response processing time by 45%

Hounslow and Chesterfield tested new software to manage consultation responses, cutting processing time by 45% and improving consistency across planning teams.

Estimated reading time: 7 minutes

London Borough of Hounslow and Chesterfield Borough Council worked together on a joint pilot to test new digital tools for managing consultation responses.

  • Outcome: Hounslow reduced the time and cost of processing responses and improved consistency across consultations, freeing up officer time for engagement and plan making.
  • Scale and approach: Pilot project delivered by planning teams from both councils working jointly with an external supplier and supported by service designers.
  • Technology used: The councils worked with Urban Intelligence to develop and test new response management features within the existing PlaceMaker platform.

This was a PropTech Innovation Fund pilot and describes what was tested at the time.

The planning challenge

Local plan consultations can generate large volumes of responses in multiple formats, including emails, PDFs, documents and portal submissions. Officers often spend weeks copying comments into spreadsheets, splitting long submissions into topics, assigning work to colleagues and drafting consistent replies. This slows down the consultation process and limits the time available for community engagement and plan making.

Hounslow and Chesterfield wanted to:

  • reduce the time spent processing responses
  • improve consistency and transparency when managing comments across consultations
  • give officers clearer tools for tagging, assigning and analysing responses
  • link representations directly to sites and evidence files
  • build a system that could be reused for future consultations and integrated with their existing sites database

What they did

The councils worked together to design and test new consultation response management software within the PlaceMaker platform.

To develop the tool, they:

  • worked with Urban Intelligence to map the end-to-end process for handling consultation comments
  • identified pain points such as copying comments manually, splitting long representations into topics and maintaining consistency across consultations
  • refined the scope with support from service designers, focusing on representation processing and the preparation of consultation statements
  • ran weekly design and development sessions with the supplier to iterate quickly
  • tested functionality during Hounslow’s live consultation on a supplementary planning document (SPD), with Chesterfield carrying out early testing ahead of its local plan consultation

The new features developed through the pilot included:

  • processing responses received in multiple formats
  • automatically splitting long responses into sub-representations
  • tagging comments by theme, policy or site
  • assigning topics to individual officers
  • applying shared response templates
  • attaching GIS (geographic information system) layers and site information to comments
  • creating a single database of representations across consultations

This gave officers more information in one place and reduced the amount of manual processing required.

Results and impact

The pilot led to measurable improvements in efficiency and consistency. Hounslow found that:

  • processing time fell from 55 minutes per comment to 30 minutes
  • the new features cut officer time spent tagging and categorising comments by 45%
  • shared templates improved consistency across teams
  • linking GIS layers and sites to comments gave officers better context
  • the shared contact database could be reused for later consultations
  • officers could assign topics and track work in one place, improving coordination
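As a quick check, the per-comment timings reported above are consistent with the 45% headline figure, assuming the reduction is calculated from the before and after averages:

```python
# Per-comment processing time reported by Hounslow (minutes).
before, after = 55, 30

# Proportional reduction: (55 - 30) / 55 = 0.4545...
reduction = (before - after) / before
print(f"{reduction:.0%}")  # prints "45%"
```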

What they learned

The councils found that:

  • using digital tools can significantly reduce the time spent processing representations
  • summarising and tagging comments still requires officer judgement and could benefit from AI in future
  • frequent collaboration with suppliers supports faster iteration
  • a single database of responses improves consistency and makes it easier to compare feedback across consultations
  • defining scope early is important, as initial plans were too broad
  • having user experience and interface design support helped translate needs into workable features

Future plans

Both councils intend to use the new system for future consultations and explore additional automation, including AI-assisted tagging and summarising. They also plan to refine how responses are fed directly into the tool through integrated consultation modules, reducing manual copying and splitting. Chesterfield will complete live testing during its next consultation. The councils are exploring how this functionality could be expanded to other planning tasks and potentially across wider council services.

If you have feedback on this case study, you can share it using our short feedback form.

Useful resources

See tools and suppliers on the Digital Planning Directory website.

Use the Digital Citizen Engagement toolkit for step-by-step guidance on planning and running digital consultations.

Explore the Open Digital Planning (ODP) community to see examples and learning from councils using digital tools.

Read guidance and case studies on using community engagement platforms in planning consultations for more examples and practical support.

External links on this page are included to help users find relevant information. Their inclusion does not imply government endorsement of any organisation, product or service.

Updates to this page

Published 19 December 2025