Embracing Proactive Management
In today’s fast-paced business environment, organizations must adopt proactive management to effectively navigate operational complexities. However, the ability to respond swiftly to emerging challenges is often hampered by time constraints on domain experts, limited access to high-quality data, and a lack of automation. These barriers create transactional bottlenecks that slow business velocity, hinder proactive decision-making, and reduce overall productivity.
As organizations work to optimize their processes, they must recognize that saving time often demands more upfront effort: an increased focus on operational streamlining is essential for reclaiming efficiency. Equally critical is a continued focus on data quality, as outdated or inaccurate information can undermine decision-making and lead to costly missteps. The path to efficiency isn’t easy, but with the right focus, businesses can turn friction into fuel for growth.
Let’s go deeper.
Lack of Time
Organizations still face many time constraints in executing both financial and non-financial processes, often due to transactional bottlenecks, resulting in missed deadlines that cascade across teams and impact more staff.
These challenges are further compounded by the constant needs of others to address urgent issues, solve complex problems, and manage ad hoc demands, all while staff juggle decision support, controls, and other critical management activities.
While saving time requires an initial investment of time, the solution isn’t as simple as adding more resources and doing more of the same. It is about improving efficiency by looking more holistically within and across domain areas, whilst proactively removing transactional friction through building robust, repeatable, and auditable processes.
However, achieving this depends on the availability of knowledgeable domain staff and a stable operating environment.
Is Headcount Resource Pressure Increasing or Decreasing?
Or have we in fact come full circle? For years, productivity initiatives, constrained by annual budgets, have paradoxically eroded the pool of skilled resources and hence available hours.
Additionally, during this time, organizations have continuously grappled with relentless crises, proliferating regulatory demands, and heightened scrutiny over data governance and cross-border flows.
Fortunately, digital transformation has emerged as a strategic counterweight to persistent cost optimization pressures in some areas, but by no means all. By streamlining workflows and dismantling organizational silos—both internally and across enterprise ecosystems—it has fostered unprecedented synergy between customer-facing and operational functions.
While these efficiency gains, particularly through tighter integration of support functions with revenue-generating operations, have created valuable organizational capacity for strategic improvement, we’ve only begun to scratch the surface of what’s possible.
The high-profile successes of FinTech and embedded banking solutions that enhance cash flow in sales and purchasing flows represent merely the first wave of transformation. Beneath these visible achievements lies a vast landscape of untapped potential, where numerous inefficiencies still await resolution.
Now, artificial intelligence (AI) and next-gen process automation are resetting expectations once again. The mandate is clear and strikes a familiar tone: unlock productivity without expanding payroll, and before hiring, ask the question: “can this job be automated?”
This means ruthlessly focusing on and eliminating transactional friction, embedding compliance into workflows, and leveraging data not just for efficiency, but as a springboard for innovation. It also means re-deploying staff into growth areas.
Having said that, to what extent has digital enablement actually moved organizations forward?
Digital Enablement – Success or Failure?
Many projects have ultimately fallen short of their expected outcomes. This is frequently attributed to change management teams lacking representation from all impacted areas within a project’s scope, and to inadequate attention to system integration requirements. Furthermore, this is occurring against a backdrop of flawed ROI calculations, which we explore further below.
To address these concerns, some organizations are temporarily deploying self-contained digital support teams to help, but one complication has been defining the scope of change. Too narrow a scope means limited operational improvement; too broad a scope risks additional change management challenges. It is a tricky balance to get right when operations must be kept running.
As a result, organizations have often been unable to move forward far enough to tackle two critical issues: transactional bottlenecks remain numerous, and access to timely, high-quality data is still limited.
Take reporting cycles as one example: while monthly reports remain a staple, their latency and volume often render them reactive tools rather than proactive assets. This underscores a broader urgency—access to real-time data isn’t just a convenience; it’s the backbone of agile decision-making.
Let’s examine the disconnected logic behind many ROI calculations.
Why Digital Enablement Projects Fail: The ROI Challenge
A critical barrier to success is the misalignment around Return on Investment (ROI)—either because projects stall due to unclear ROI justification or because success isn’t measured effectively, if at all.
The Problem: Outdated Metrics & Silos
Companies, as we touched on, often rely on monthly reporting cycles as a crutch, limiting their metrics to generic KPIs like:
- Days to close
- Days Sales Outstanding (DSO)
- Days Payable Outstanding (DPO) (both DSO and DPO are sketched in code after this list)
- EBITDA
- Cash generation (positive or negative)
- Top 10 deals for existing and new named customers, etc.
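Two of these KPIs, DSO and DPO, have standard formulas that are worth making concrete. A minimal sketch in Python, using illustrative figures and a 365-day period:

```python
def days_sales_outstanding(accounts_receivable: float,
                           credit_sales: float,
                           period_days: int = 365) -> float:
    """DSO: average number of days taken to collect payment after a sale."""
    return accounts_receivable / credit_sales * period_days

def days_payable_outstanding(accounts_payable: float,
                             cost_of_goods_sold: float,
                             period_days: int = 365) -> float:
    """DPO: average number of days taken to pay suppliers."""
    return accounts_payable / cost_of_goods_sold * period_days

# Illustrative figures only.
print(days_sales_outstanding(1_200_000, 9_000_000))   # ~48.7 days
print(days_payable_outstanding(700_000, 6_000_000))   # ~42.6 days
```

Both are backward-looking averages, which is precisely the criticism that follows: they summarize a period after the fact rather than exposing where, mid-cycle, a process is stalling.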
These metrics don’t capture process inefficiencies (e.g., delays caused by manual handoffs) or cross-functional bottlenecks. Worse, they ignore the potential of integrated workflows—like combining HR and financial data—to unlock new ROI opportunities (while ensuring compliance).
Organizations are responding slowly, but AI has perhaps arrived at a timely moment, opening eyes to future possibilities and focusing attention on accelerated digital enablement.
Companies are thinking more about:
- Proactive management (resolving operational and process issues mid-cycle, not post-mortem).
- Cross-application workflows that drive value, especially at the intersection of domain areas (e.g., HR and Finance), which remains a largely untapped area.
- ROI frameworks tied to actual process improvements rather than broad statements on cost savings, with detailed metrics that could in turn drive resource re-allocations in either direction.
Thankfully, modern ProcessTech is designed to bridge these gaps and drive more effective ROI, but first we will explore why access to timely, high-quality data is important and why you may be overestimating your current access to it.
Accessing Timely Data
Many organizations still rely on monthly reporting as their primary means of identifying areas of concern that require further attention from their domain teams. However, the information contained in these reports is often voluminous, forcing users to conduct their own detailed reviews to uncover specific issues in need of attention—assuming they have the time and capabilities to do so. As a result, decisions are frequently made based on incomplete data or, even worse, on intuition.
So, part of any change management initiative has to be working through the advantages of becoming a data-driven organization. But timeliness alone isn’t enough. Data must also be accurate, structured, and actionable to drive value.
Here’s how organizations can elevate quality:
Enhancing Data Quality
Improving quality through:-
- Leveraging Subsystems: Useful information often already exists in your systems but simply needs to be surfaced. This is achieved by seamlessly accessing relevant information from sub-applications and presenting it in a meaningful way to the right person(s), after executing the simple or complex detailed transformations that modern process technologies now make possible. Often you will be looking at data points from the perspective of differing segments, and these nuggets of information are data subsets
- Data Validations: Ensuring that accompanying transactional tags, like analysis codes, are complete and accurate, revalidating if necessary
- Data Enrichments: Being able to enrich information during day-to-day processing to make subsequent analysis or traceability easier at any time. For example: including exact allocation calculation criteria or source ledger details in complex financial aggregations and consolidations
An example: where analysis fields have been squeezed into non-indexable fields in legacy systems, additional untapped value can be released by enabling them to be indexed and validated. This is particularly relevant where ad hoc fields contain meaningful, structured, non-validated information, i.e., the first two digits are XX and the next two digits are YY, etc. (a parsing sketch follows this list)
- Automating Reconciliations: Executing reconciliations that enable users to identify and focus on only those areas of concern, rather than sifting through heaps of data points to identify them
- Avoiding Information Overload: Presenting only relevant actionable, contextual information to the materiality level set by the process owner, so as to save time by avoiding information overload or having to locate supporting details as an extra step
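As flagged in the enrichment example above, here is a minimal sketch of parsing and validating such legacy ad hoc fields. The field layout (two region characters followed by two product-line digits), the reference code sets, and the sample values are all hypothetical:

```python
import re

# Hypothetical reference data the extracted codes are validated against.
VALID_REGIONS = {"EU", "US", "AP"}
VALID_PRODUCT_LINES = {"01", "02", "07"}

LEGACY_PATTERN = re.compile(r"^(?P<region>[A-Z]{2})(?P<product>\d{2})")

def parse_legacy_tag(raw: str) -> dict:
    """Extract structured codes from a non-indexed legacy field and
    flag validation failures for follow-up."""
    match = LEGACY_PATTERN.match(raw.strip())
    if not match:
        return {"raw": raw, "valid": False, "reason": "unparseable"}
    region, product = match.group("region"), match.group("product")
    errors = []
    if region not in VALID_REGIONS:
        errors.append(f"unknown region '{region}'")
    if product not in VALID_PRODUCT_LINES:
        errors.append(f"unknown product line '{product}'")
    return {"raw": raw, "region": region, "product": product,
            "valid": not errors, "reason": "; ".join(errors) or None}

# Once parsed, the codes can be indexed and validated like any
# first-class analysis field.
print(parse_legacy_tag("EU07-ACME-REF12345"))
print(parse_legacy_tag("ZZ99-UNKNOWN"))
```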
In essence, the above goes a long way toward supporting other automations and decision-making, as the resulting information is of inherently higher quality.
In fact, with this foundation in place, automation becomes the natural next step—transforming clean data into seamless workflows. The goal of course is to replace manual interventions with scalable, error-proof processes.
Driving Automation
There is a practical need for organizations to undertake full or partial automations to drive effective and efficient decision support, controls, and management activities. Achieving this typically requires extensive data transformations. As many applications do not support this level of transformation, it is often executed today within complex multi-tabbed spreadsheets, which slows the overall process down. This method also means that tight data integrity controls are lacking, despite best efforts to the contrary, and staff handovers become more complex.
Using the latest process technologies, the use of spreadsheets in core areas can be eliminated to enable automated, repeatable, and auditable processes. The process owner can decide the level of automation (or simply mandate controlled eyeball reviews). Users gain access to timely, high-quality data.
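To illustrate what “repeatable and auditable” can mean in practice, here is a minimal sketch of a transformation pipeline that records an audit trail as it runs. The step names and invoice data are hypothetical, and the logging stands in for the kind of trail a vendor platform would maintain for you:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

@dataclass
class AuditedPipeline:
    """A chain of named transformation steps with an audit trail,
    standing in for ad hoc multi-tabbed spreadsheet logic."""
    steps: list = field(default_factory=list)
    audit_log: list = field(default_factory=list)

    def add_step(self, name: str, fn: Callable) -> "AuditedPipeline":
        self.steps.append((name, fn))
        return self

    def run(self, data: list) -> list:
        for name, fn in self.steps:
            rows_in = len(data)
            data = fn(data)
            self.audit_log.append({
                "step": name,
                "rows_in": rows_in,
                "rows_out": len(data),
                "at": datetime.now(timezone.utc).isoformat(),
            })
        return data

# Hypothetical example: filter, then normalize invoice records.
invoices = [{"amount": 120.0}, {"amount": -5.0}, {"amount": 80.0}]
pipeline = (AuditedPipeline()
            .add_step("drop_negatives",
                      lambda rows: [r for r in rows if r["amount"] > 0])
            .add_step("to_cents",
                      lambda rows: [{"amount_cents": int(r["amount"] * 100)}
                                    for r in rows]))
print(pipeline.run(invoices))   # transformed data
print(pipeline.audit_log)       # step-by-step trail for process audit
```

Every run produces the same trail, which is what makes the process repeatable, handover-friendly, and auditable.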
Three areas to highlight:-
- First, process owners have the option to determine whether to incorporate spreadsheets into their process execution, recognising that vendor technologies can now ensure full data integrity within them. Alternatively, some companies prefer that spreadsheets are totally eliminated from core processes altogether for fear of introducing errors or risks of data manipulation, noting that users will still utilize them for other tasks. Both of these deployment extremes can be achieved, so it becomes a user’s choice
- Second, process owners must develop confidence that business flows are functioning as intended. This may involve going through iterative design phases that progressively enhance automation with each iteration
- Ultimately, even the most complex transformations can be reduced to a single button press producing a single results page, further demonstrating the power of these process technologies
Automations will enable more proactive management.
Adapting to Proactive Management
Moving to a proactive, data-driven organization is a mindset change that can drive value creation. For example, any inventory anomalies detected can be subject to contextual workflows directed to product managers for immediate corrective action.
In essence, specific Apps + Smart Processes + Applications of Record combine to drive actionable, contextual workflows.
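A minimal sketch of that inventory example, assuming hypothetical variance thresholds and a routing table of product managers (all names, categories, and rules are illustrative):

```python
# Hypothetical routing table: product category -> responsible manager.
PRODUCT_MANAGERS = {"electronics": "p.smith", "apparel": "j.jones"}

def detect_anomalies(stock_records, max_variance_pct=5.0):
    """Flag records where counted stock deviates from book stock
    beyond a threshold set by the process owner."""
    for rec in stock_records:
        variance_pct = abs(rec["counted"] - rec["book"]) / rec["book"] * 100
        if variance_pct > max_variance_pct:
            yield {**rec, "variance_pct": round(variance_pct, 1)}

def route_workflow(anomaly):
    """Create a contextual task for the responsible product manager."""
    owner = PRODUCT_MANAGERS.get(anomaly["category"], "inventory.control")
    return {"assignee": owner,
            "action": "investigate stock variance",
            "context": anomaly}

records = [
    {"sku": "A100", "category": "electronics", "book": 500, "counted": 430},
    {"sku": "B200", "category": "apparel",     "book": 200, "counted": 198},
]
for anomaly in detect_anomalies(records):
    print(route_workflow(anomaly))  # only material anomalies reach a manager
```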
Recognise that there are other deep evolutionary changes taking place:
- Process velocity is accelerating
- Front and back office processes are becoming richer + more automated + more intertwined, with more dependencies, meaning that demarcation lines between domain areas are blurring
- Going forward, processes will need to be proactively managed to ensure that they are fit for purpose and achieving their desired outcomes. This in itself is another change, and it directly contrasts with the management of older business systems, where upgrades and iterations were often avoided as much as possible so as not to break an already working process
- Risk management will have to be enhanced, as breaches of connected systems will have a broader impact on other connected processes and extended ecosystems.
Understanding Capabilities of ProcessTech
Modern-day software has been re-architected by many vendors, though certainly not all, to leverage the powerful compute capabilities of today’s processor technologies and to enable companies to build smart, ultra-granular, end-to-end processes. To put this into perspective, this is a testament to the combined software + processor power available to users today, which in itself was the precursor enabling GenAI.
At a high level, today’s smart business processes (ProcessTech) can be described as going from data collection, through all required process flows, to actionable, contextual alerts / workflows @anytime @anywhere + simulations + AI + APIs / MCPs (connections to other internal or external systems). Privacy and security considerations will also be part of process creation and long-term management, as these can be significantly enhanced from what was possible before.
Staff Know-How to Leverage ProcessTech
However, while process capabilities continue to increase rapidly, it is important for companies to understand that employee and company capabilities to leverage ProcessTech lag behind.
The reason is that deployment teams have to fully understand the subtleties of a process, rather than relying on more holistic application descriptions, where functionality is implied at a much higher level rather than specified factually.
Success will depend on having i) functional domain process knowledge within and across domain areas, and ii) a grasp of new emerging areas that can be leveraged to add value to a process. As an example of the latter, driving simulations has always been technically possible, albeit expensive, but today’s tech capabilities are making it far more powerful (i.e., calculations can be far more pervasive across areas) and at the same time accessible to all.
Added into the overall mix today for finance is that we are in a period where fewer people are entering or remaining within the accounting profession, partly because Covid and AI have reset and re-aligned their future career expectations. This suggests that employment pressures will likely increase over time, or conversely act as a catalyst for accelerated AI deployment.
The skills required to succeed going forward will sit within and between functional domains, such as finance, operations, compliance and IT. While this sounds logical, it will also be impacted by the erosion of entry-level job experience, which is increasingly being undertaken by AI.
Users and managers must therefore be proactively educated on how to leverage these new systems, must be aware of their potential weaknesses, such as AI’s tendency to hallucinate, and must think through the longer-term implications of maintaining domain continuity.
Knowledge lags, coupled with poor team composition and integration issues, have created an early environment in which digital enablement projects have failed to deliver fully against both operational and ROI expectations.
Let’s look deeper at ProcessTech and some areas that can save you time.
The Versatility of Modern Process Technologies
Today’s systems handle both qualitative (e.g., performance appraisals) and quantitative (e.g., financial reporting) processes with equal sophistication. They’re equally adept at:
- Specialized tasks (e.g., bonds/leases managed by a few experts).
- Enterprise-wide workflows (e.g., budgeting/FP&A involving many stakeholders).
Key Capabilities:
- Cross-functional integration: Seamlessly connects intra- and inter-departmental workflows—a capability legacy systems lacked.
- AI-driven precision: Enhances accuracy by pulling contextual data from third-party sources (e.g., market feeds).
Examples:
- Quantitative (FinTech): A loan application automates:
  - Compliance checks (credit scores via third-party APIs)
  - Contract amortization journal entries
  - Corrective workflows for missed payments
  - Holistic simulations to optimize customer offers
- Quantitative (Consolidation):
  - Auto-chase for missing report pack submissions, with escalations
  - Auto-rejection of FP&A submissions if specific guidelines are not followed
  - Full preparation of reporting packs at each entity, with temporary iterative stages for full proof of concept
  - Automatic processing of reporting packs at HQ
  - Automatic ranking of variances, with supporting information, to the level of materiality set by the process owner (sketched in code after these examples)
  - Screen capturing for training process operators or other stakeholders
  - Process audit by the process owner or an authorised party, showing the detailed processing stages for transparency
  - Applying processing-time metrics: processes can only move as fast as their slowest step, so addressing bottlenecks gains incremental momentum
- Qualitative (HR): Appraisal systems trigger:
  - Action reminders
  - Pay rise management
  - Two-way feedback loops (employee ↔ manager)
- Qualitative (Supplier Management):
  - Reputation assessment (customer reviews, industry standing)
  - Cultural fit evaluation (values alignment, communication style)
  - Relationship health (survey feedback, responsiveness)
  - Crisis adaptability (e.g., handling supply chain disruptions)
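To make the variance-ranking item in the consolidation example concrete, here is a minimal sketch, assuming hypothetical budget and actual figures and a materiality threshold set by the process owner (entity names, accounts, and amounts are illustrative):

```python
def rank_variances(lines, materiality=10_000):
    """Rank budget-vs-actual variances, surfacing only those above
    the materiality level set by the process owner."""
    material = [
        {**line, "variance": line["actual"] - line["budget"]}
        for line in lines
        if abs(line["actual"] - line["budget"]) >= materiality
    ]
    return sorted(material, key=lambda l: abs(l["variance"]), reverse=True)

# Illustrative figures only.
report_lines = [
    {"entity": "UK", "account": "Travel",    "budget": 50_000,  "actual": 82_000},
    {"entity": "DE", "account": "Payroll",   "budget": 400_000, "actual": 404_000},
    {"entity": "FR", "account": "Marketing", "budget": 120_000, "actual": 95_000},
]
for line in rank_variances(report_lines):
    print(line)  # UK Travel (+32,000) first, then FR Marketing (-25,000); DE filtered out
```

The point is that users see a ranked shortlist with supporting context, rather than a full trial balance to sift through.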
Much of the above might not leverage AI. Let’s now explore why AI represents another step-change leap on top of an existing step-change leap: that of ProcessTech.
Artificial Intelligence
Artificial intelligence is going to enhance your capabilities, potentially in all areas over the longer run. Deployments will be a mixture of i) vendors incorporating AI into their solution sets, ii) companies initiating their own AI projects for specific outcomes, and iii) both vendors and companies creating algorithms from scratch or leveraging those from third parties to drive deeper value creation in their core domain areas.
Companies will have to proactively manage their AI deployments to mitigate hallucinations, noting that algorithmic designs will contain multiple concurrent and sequential algorithms, and algorithm types, to drive meaningful results.
Take procurement as a practical example of driving value in your organization: OCR and LLMs can facilitate automated document onboarding, while other internally developed algorithms can ensure that overriding supplier discounts, which may not be reflected on purchase invoices, are still applied by the system to deliver value. Process owners will be augmented by this technology.
Note that LLMs are often described as bottom-up AI models; in this case, the application of overriding supplier discounts is a top-down AI model, i.e., a smaller, focused model (see the sketch below).
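As a minimal sketch of that top-down idea, assume a hypothetical table of negotiated discount overrides keyed by supplier; the suppliers, rates, and invoice fields are all illustrative:

```python
# Hypothetical negotiated discounts not shown on the invoices themselves.
SUPPLIER_DISCOUNTS = {"ACME": 0.05, "GLOBEX": 0.08}

def apply_discount_overrides(invoice: dict) -> dict:
    """Apply an overriding supplier discount even when the purchase
    invoice does not reflect it, and flag the override for review."""
    rate = SUPPLIER_DISCOUNTS.get(invoice["supplier"], 0.0)
    payable = round(invoice["gross"] * (1 - rate), 2)
    return {**invoice,
            "discount_rate": rate,
            "payable": payable,
            "review_flag": rate > 0 and invoice.get("invoice_discount", 0) == 0}

inv = {"supplier": "ACME", "gross": 10_000.0, "invoice_discount": 0}
print(apply_discount_overrides(inv))
# payable becomes 9500.0 and the line is flagged, so the process owner
# can see that an off-invoice discount was applied
```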
Algorithms are very capable and transparent in some areas, but they can be weak in others, meaning that you will need proofs of concept to validate their use. All of us are still learning, but today many AI models augment users rather than replace them.
In some situations, even the creators of large LLMs cannot always prove why a model produces certain results (i.e., hallucinations). Think of this as neural networks being leveraged in parallel to such a voluminous extent that how end results are reached cannot be tracked easily. There is no transparency.
Conversely, other industry-led initiatives are in play to help users assess the accuracy of generated results. For example:-
- RAG (Retrieval-Augmented Generation) based GenAI grounds generated answers in retrieved source material, giving users a way to verify that AI-generated data points are valid rather than hallucinated (a minimal retrieval sketch follows this list).
- The FinanceBench open-source dataset is another resource, allowing GenAI to be tested against a database of known results. It is a collaboration between AI researchers and 15 financial industry domain experts, comprising high-quality questions and answers derived from public financial documents, including SEC filings (10-Ks, 10-Qs, 8-Ks), earnings reports, and earnings call transcripts. At the moment, using GenAI with financial data sets is not accurate, reflecting that financial data is not probabilistic but absolute, i.e., a number cannot be 70% right (public data is available on results using different LLMs). What this means for all of us is that we have to dissect the use of AI functionality to understand where and how it is adding value.
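To illustrate the RAG pattern referenced above, here is a deliberately simplified sketch. A real deployment would use vector embeddings and an LLM; the source passages and the keyword-overlap retrieval are illustrative stand-ins, but the key point, returning citations alongside the generated context, is the same:

```python
# Hypothetical source passages, e.g. extracts from public filings.
SOURCES = {
    "10-K p.41": "FY2024 revenue was $4.2bn, up 6% year on year.",
    "10-Q p.12": "Q2 operating margin improved to 18.3%.",
    "8-K item 2": "The company completed the acquisition of ExampleCo.",
}

def retrieve(question: str, top_k: int = 2):
    """Rank sources by naive keyword overlap with the question.
    Real systems use embedding similarity instead."""
    q_words = set(question.lower().split())
    scored = sorted(SOURCES.items(),
                    key=lambda kv: len(q_words & set(kv[1].lower().split())),
                    reverse=True)
    return scored[:top_k]

def answer_with_citations(question: str) -> dict:
    """Assemble the context an LLM would be prompted with, keeping
    the source references so users can verify the generated answer."""
    hits = retrieve(question)
    return {"question": question,
            "context": [text for _, text in hits],
            "citations": [ref for ref, _ in hits]}  # the verifiable trail

print(answer_with_citations("What was revenue in FY2024?"))
```

Because the citations travel with the answer, a user can check a generated data point against its source rather than trusting the model’s output blindly.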
GenAI can be very powerful in augmenting user process flows across a wide number of areas supporting a process, but numerical content will need more intense validation for the reasons explained above.
There are also other AI areas emerging that point to an ever-faster cycle of technological change:-
Agentic AI: refers to processes that can self-learn, such as AI adding missing information, or, in the longer term, functioning as part of a larger automated team of AI bots. This is at once exciting and worrying in terms of its medium- to long-term implications for society. As a result, life cycle governance comes into play, meaning that bots’ access rights will have to evolve and shift as they learn, adapt, and at some point retire.
Conceptually similar to FinanceBench, there is considerable interest in exploring how far agentic AI can be pushed in the future. The site https://the-agent-company.com/ aims to explore this and has some interesting leaderboard statistics. Also worthy of note is this article: https://www.reuters.com/business/over-40-agentic-ai-projects-will-be-scrapped-by-2027-gartner-says-2025-06-25/ There is no doubt that progress will be fast-paced from this point.
Interestingly, vendors are planning for a future in which multiple bots must be managed. This may lead to HR system vendors introducing a new type of digital employee: for example, “process bots” that come complete with job descriptions, performance reviews, and critical information regarding their interdependencies with other bots. The latter points to a different type of future risk, associated with interconnected systems that cross boundaries into other third parties.
AI’s potential is undeniable, but its true impact hinges on integration with human expertise and robust processes. The lesson? Technology alone isn’t the answer—it’s how organizations harness it that defines success.
Conclusion
To thrive in today’s fast-evolving business landscape, organizations must shift from reactive to proactive, data-driven management. By addressing time constraints, improving data quality, and leveraging automation, businesses can unlock efficiency and agility. Modern ProcessTech and AI offer transformative potential—streamlining workflows, enhancing decision-making, and mitigating risks—but success hinges on employee upskilling and cross-functional collaboration.
The future belongs to organizations that embed automation, integrate real-time insights, and measure ROI beyond cost savings. From finance to HR, interconnected workflows will drive innovation, while AI augments—but does not replace—human judgment. However, change must be intentional: clear metrics, iterative testing, and adaptive leadership are critical.
The path forward is clear: break silos, invest in smart processes, and foster a culture of continuous improvement. Those who act now will not only optimize operations but also future-proof their competitive edge. Is your organization automating workflows—or still firefighting monthly reports?