TECHNOLOGY
AI can streamline the review and categorisation process by analysing huge quantities of data and determining whether an expense qualifies for inclusion in a PSA. At its most basic, a simple decision tree may be used, as with the drinks example above. A variety of classification criteria can be designed to review expense and payables data quickly, identifying and categorising items ready for input into the PSA calculation. With appropriately designed and robust criteria, this can turn copious amounts of work into a process completed monthly at the touch of a button.
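As an illustration only, the sketch below shows how such a rule-based first pass might look in Python. The categories, field names and threshold are hypothetical examples, not HMRC guidance or a description of any particular PSA tool; real criteria would be agreed with advisers and tailored to the employer's own PSA.

```python
# Illustrative only: a minimal rule-based decision tree for an initial PSA
# categorisation pass. All rules and categories below are hypothetical.

from dataclasses import dataclass

@dataclass
class Expense:
    description: str    # free-text narrative from the expense/payables system
    attendee_type: str  # e.g. "staff", "client", "intermediary"
    amount: float       # cost per head in GBP

def categorise(expense: Expense) -> str:
    """Walk a simple decision tree and return a provisional PSA category."""
    text = expense.description.lower()

    # Client or intermediary entertaining sits outside the PSA altogether.
    if expense.attendee_type in ("client", "intermediary"):
        return "client entertaining - exclude from PSA"

    # Staff drinks/refreshments: low-value items are flagged for review
    # against the trivial benefits conditions rather than assumed exempt
    # (the threshold here is illustrative).
    if "drinks" in text or "refreshments" in text:
        if expense.amount <= 50:
            return "possible trivial benefit - route for human review"
        return "staff entertaining - include in PSA"

    # Anything the rules don't recognise is routed to a person.
    return "uncategorised - route for human review"

print(categorise(Expense("Team drinks after year end", "staff", 18.50)))
```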
Further enhancements in AI, such as adaptive and machine learning models, are increasingly being used in PSA tools. These provide a dynamic system which learns the appropriate categorisation of data based on historical patterns, business-specific narratives, changes in tax rules and user input: for example, learning that drinks for a specific intermediary are in fact non-taxable client entertaining expenses for PSA reporting purposes, or that a reimbursement of certain expenses should now be treated as taxable.
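To make that adaptive idea concrete, here is a rough, hypothetical sketch assuming the scikit-learn library: a classifier is trained on previously categorised expense narratives and refitted when a reviewer corrects a suggestion, so business-specific wording is picked up over time. Commercial PSA tools may work quite differently.

```python
# Hypothetical sketch of an adaptive categorisation step (requires scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative training data: narratives categorised in prior PSA cycles.
narratives = [
    "drinks with ABC intermediary",
    "team drinks to celebrate quarter end",
    "long service award dinner",
    "staff taxi home after late working",
]
labels = [
    "client entertaining - exclude from PSA",
    "staff entertaining - include in PSA",
    "staff entertaining - include in PSA",
    "late night travel - human review",
]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(narratives, labels)

# New narratives are given a suggested category for a human to confirm.
print(model.predict(["drinks with ABC intermediary, project kick-off"]))

# A reviewer's correction becomes new training data; refitting means the
# same narrative is categorised in line with the correction next time.
narratives.append("gift vouchers for project go-live")
labels.append("taxable benefit - include in PSA")
model.fit(narratives, labels)
```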
Challenges and considerations
When considering the risks, parallels can again be drawn to the introduction of the spreadsheet in 1979. Although spreadsheets have reshaped how accountancy professionals work, famous examples of errors persist, such as the loss of nearly 16,000 positive Covid-19 test results due to a limit on the number of rows a spreadsheet could accommodate. Once AI becomes the norm, there's a risk that professionals come to rely on it without really understanding what it's doing and why. While AI may reduce the risk of user input errors, it may introduce new ones, such as 'hallucinations' by AI systems. When machine learning tools are applied, employers need confidence that the inputs are technically correct, that the system is interpreting the data correctly and that potential errors are flagged for human review. A key challenge is ensuring appropriate human oversight and transparency: AI should complement human judgment, not replace it entirely.

Of course, HMRC already uses AI to enhance its tax compliance efforts, and has done for at least a decade. One such example is HMRC Connect, which aggregates data from various sources to identify risk and trigger action. However, both HMRC and employers will need to understand AI's limits in order to maintain trust in the technology. For example, on a compliance check, what data and audit trail will be available to demonstrate to HMRC that the data has been categorised correctly?

What's next?
As AI continues to evolve, its impact on PSAs will grow. Employers should embrace AI as a powerful ally in managing tax obligations, while ensuring transparency and compliance. In a professional environment increasingly focussed on appropriate processes, procedures and controls, AI can act as a powerful risk and compliance management tool, empowering employers to review larger quantities of data than ever in a fraction of the time. The future of PSAs is likely to lie at the intersection of human expertise and AI-driven efficiency. AI's role in PSAs is not about replacing humans, but about empowering them to work more efficiently, eliminating tedious tasks and making smarter, data-driven decisions. Ultimately, the technology is intended to provide the means to simplify and improve employer compliance processes and controls.