From the Winter 2020 issue of Pennsylvania CPA Journal.
Internet use by some 4.39 billion people1 generates quintillions of bytes of data each day. Companies in all industries are adopting robotic process automation, data analytics, and artificial intelligence to harness this data and become more efficient and profitable. This feature discusses the potential regulatory impact of data analytics, a few use cases showing how these tools have been implemented, and some guiding principles for implementing process automation or audit data analytics.
In the age of disruption
An Accounting Today survey from 2018 asked 800 accounting firms of all sizes about their expectations for the industry in 2019. Keeping up with new technology was among the top five challenges reported. Further, firms allocated 13.3 percent of their spending budgets to technology, and half of the respondents indicated their firms intended to increase technology spending during 2019.2 As a supplement to the survey, quotes and testimonials provided a macro-level overview for 2019, and the recurring theme was innovation. “Traditional” accounting services, such as auditing, are not going away, but the processes and procedures being used are in a state of disruption.
In June 2019, the AICPA Auditing Standards Board (ASB) released an exposure draft of proposed Statement on Auditing Standards, Audit Evidence.3 References to data analytics, new technologies, and automated tools and techniques abound throughout the draft. A focus of the draft is assessing whether the proposed changes to the auditing standards are appropriate given the pace of change in the use of technology by auditors and preparers. Of particular interest is whether the current definition of sufficient and appropriate audit evidence remains relevant in light of automated tools and techniques. The initial proposal is to amend the definition of sufficiency to focus on the persuasiveness of audit evidence rather than its quantity. This change is driven by the prospect of technological advances that relieve the burden of digesting significant quantities of data through automation and audit data analytics. Traditionally, an auditor would rely on random sampling or a coverage-based approach for detail testing; directing the auditor’s focus instead to the attributes, factors, and relationships present within a data set appears to be at the forefront of the proposed changes. The ASB does not believe that the use of audit data analytics alone qualifies as an audit procedure under the current, discrete classifications, but it explicitly promotes the use of automated tools and techniques in conjunction with testing or inquiries to meet audit objectives.
ADAPT-ing to change
The AICPA assurance services executive committee’s emerging assurance technologies task force released the Audit Data Analytics to Audit Procedures mapping document (ADA mapping document) to illustrate how data analytics can be incorporated into audit engagements to replace traditional audit procedures.4 The applicable AICPA Statements on Auditing Standards, Public Company Accounting Oversight Board (PCAOB) Standards, and Audit Data Standards for each corresponding procedure are also listed. The ADA mapping document can help guide firms with initial implementation or incorporation of automated tools and techniques.
Schneider Downs, through its Automation and Data Analytics Process Team (ADAPT), has used this mapping document to help incorporate data analytic procedures into engagements. The ADA mapping document even recommends several data processing tools. Most of these tools allow the user to create macros, or scripts, that record procedures and repeat them automatically when initiated by a user. By reducing the time spent on data manipulation and processing, while simultaneously standardizing the procedure, scripts free up more time for value-added tasks, such as data analysis and interpretation. Overall efficiency is a clear benefit, but the effectiveness of procedures created with these tools is of equal importance.
Standardizing data analytic processes provides consistent application of procedures, ensures the accuracy of results, and aids in the interpretation of outputs. Even with the same fact pattern, multiple individuals may arrive at different conclusions depending on their interpretations of the task at hand. Scripting provides comfort that procedures have been applied consistently, even when executed by different individuals. Standardization is an important factor in becoming more efficient; however, each engagement is different and requires a customized approach based on its identified risks. Data analytic tools and concepts can be leveraged across multiple engagements, but they are not intended to be one-size-fits-all solutions. It is crucial to understand the underlying procedures being completed by a script so that edits and adjustments can be made for appropriate application when it is shared between jobs. Scripting audit procedures, whether for general tasks or for specific engagements, provides a continuing benefit and can help increase profitability.
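As a rough illustration of what such a script looks like in practice, the sketch below (in Python, with invented field names, accounts, and thresholds) shows a procedure written once and parameterized per engagement, so the steps, and therefore the form of the results, stay identical across jobs:

```python
# A minimal sketch of a "script" as a parameterized, reusable procedure.
# Field names, accounts, and the threshold are illustrative, not from the article.

def run_procedure(rows, amount_field, threshold):
    """Standardized step: total the population and extract items over a threshold."""
    total = sum(r[amount_field] for r in rows)
    over = [r for r in rows if r[amount_field] > threshold]
    return {"population_total": total, "items_over_threshold": over}

# Two engagements can use different thresholds, but the procedure itself --
# and therefore the form and consistency of the output -- never varies.
ledger = [{"acct": "4000", "amt": 1200.0}, {"acct": "5000", "amt": 300.0}]
result = run_procedure(ledger, "amt", threshold=1000.0)
```

Because only the parameters change between jobs, reviewing the script once effectively reviews the procedure for every engagement that runs it.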
Audit data analytics and other procedures
In accordance with certain AICPA and PCAOB auditing standards, auditors are required to perform procedures on journal entries and other adjustments. Traditionally, this was accomplished by examining a sample of entries and reviewing the full journal entry population to identify entries exhibiting certain risk criteria.
With data analytics and processing tools, these criteria can be applied to 100 percent of the population, allowing the auditor to sample much more efficiently by extracting the journal entries that match them. Identified entries can then be analyzed and discussed further with the client. As an auditor, it is important not only to document the process, but also to assess and draw conclusions on items determined to have higher-risk qualities.
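A minimal sketch of this kind of population-wide screen, assuming a simplified journal-entry layout and illustrative risk criteria (weekend postings, round-dollar amounts, generic user IDs), might look like:

```python
from datetime import date

# Hypothetical journal-entry population; field names and criteria are illustrative.
entries = [
    {"id": 1, "amount": 12500.00, "posted": date(2019, 6, 3), "user": "jdoe"},
    {"id": 2, "amount": 50000.00, "posted": date(2019, 6, 8), "user": "admin"},
    {"id": 3, "amount": 813.27, "posted": date(2019, 6, 5), "user": "jdoe"},
    {"id": 4, "amount": 99999.99, "posted": date(2019, 6, 30), "user": "cfo"},
]

def risk_flags(entry):
    """Return the list of risk criteria an entry matches."""
    flags = []
    if entry["posted"].weekday() >= 5:   # posted on a weekend
        flags.append("weekend")
    if entry["amount"] % 1000 == 0:      # round-dollar amount
        flags.append("round_amount")
    if entry["user"] == "admin":         # generic or privileged user ID
        flags.append("generic_user")
    return flags

# Apply the criteria to 100 percent of the population, not a sample.
flagged = []
for e in entries:
    flags = risk_flags(e)
    if flags:
        flagged.append((e["id"], flags))
```

The flagged subset, rather than a random sample, then becomes the starting point for follow-up inquiry and documentation.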
Another example involves fixed assets and the corresponding depreciation expense accounts. Traditionally, a sample of fixed assets was selected to recalculate depreciation expense based on the company’s policy, supplemented by a high-level analysis of the fluctuation in total depreciation expense compared with the prior year or other expectations. With data analytics, depreciation expense can be recalculated for 100 percent of individual fixed assets. Data analytics can also provide insight into any fixed assets that had an unexpected change in depreciation expense, depreciation method, or useful life. With process automation applied to the examples described above, the analysis can be completed with the click of a button as opposed to hours of manual manipulation.
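For instance, a full-population recalculation could be sketched as follows, assuming hypothetical asset records, a straight-line policy, and a simple rounding tolerance:

```python
# Hypothetical fixed-asset subledger; a straight-line policy is assumed.
assets = [
    {"id": "A1", "cost": 60000.0, "salvage": 0.0, "life_years": 5, "booked": 12000.0},
    {"id": "A2", "cost": 24000.0, "salvage": 4000.0, "life_years": 10, "booked": 2000.0},
    {"id": "A3", "cost": 90000.0, "salvage": 0.0, "life_years": 10, "booked": 4500.0},
]

def expected_straight_line(asset):
    """Annual straight-line depreciation per the (assumed) company policy."""
    return (asset["cost"] - asset["salvage"]) / asset["life_years"]

TOLERANCE = 1.0  # acceptable rounding difference, in dollars

# Recalculate for 100 percent of assets and flag unexpected differences.
exceptions = [
    (a["id"], round(expected_straight_line(a) - a["booked"], 2))
    for a in assets
    if abs(expected_straight_line(a) - a["booked"]) > TOLERANCE
]
```

Each exception, such as an asset whose booked expense implies a changed method or useful life, can then be investigated individually.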
Two of the most impactful changes to accounting standards in recent history are Financial Accounting Standards Board (FASB) Accounting Standards Updates (ASU) No. 2014-09, Revenue from Contracts with Customers (ASU 2014-09), and No. 2016-02, Leases (ASU 2016-02). These ASUs have a significant impact on accounting processes as well as the audit procedures required to appropriately audit the adjustments resulting from the accounting change.
A critical part of ASU 2016-02’s adoption is identifying a complete population of leases, including lease components of non-lease agreements. To address this challenge for lessees, ADAPT developed a tool to scan the entire cash disbursement subledger to identify potential leases or lease components based on payments to vendors. This can also be applied on the cash receipts subledger for lessors. Resulting databases should be compared to the lease schedule provided by the client to ensure the listing is complete and accurate.
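One way such a scan might work, sketched here with invented vendors and amounts rather than ADAPT's actual tool, is to group disbursements by vendor and flag those receiving many near-identical recurring payments:

```python
from collections import defaultdict

# Hypothetical cash-disbursement rows: (vendor, month, amount).
disbursements = (
    [("Acme Realty", m, 5000.0) for m in range(1, 13)]
    + [("Office Supply Co", 3, 412.18), ("Office Supply Co", 9, 1780.55)]
    + [("Fleet Leasing LLC", m, 1250.0) for m in range(1, 13)]
)

by_vendor = defaultdict(list)
for vendor, month, amount in disbursements:
    by_vendor[vendor].append(amount)

def looks_like_lease(amounts, min_payments=6):
    """Flag vendors receiving many payments of near-identical amounts."""
    if len(amounts) < min_payments:
        return False
    return (max(amounts) - min(amounts)) / max(amounts) < 0.05  # within 5 percent

potential_leases = sorted(v for v, amts in by_vendor.items() if looks_like_lease(amts))
```

The flagged vendors are candidates only; as the article notes, the resulting list must be compared with the client's lease schedule to assess completeness.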
With the adoption of ASU 2014-09, one of the biggest impacts for many companies is the shift to recognizing revenue over time as opposed to at a point in time. This is a challenge for companies that have historically recognized revenue at the time of shipment or delivery. The analysis to determine appropriate revenue recognition differs for each company, depending on the nature of its revenue streams and contracts, so a standard approach is difficult to apply. One industry significantly affected by ASU 2014-09 is transportation and logistics. Historically, revenue in this field was recognized upon delivery; under the new standard, it is to be recognized over time as the delivery occurs. As an example, ADAPT obtained a report of all deliveries completed during the subsequent period through the date of audit fieldwork to determine the number of days each delivery took to complete. ADAPT then extracted the loads that were in transit as of the measurement date and applied the proportionate amount of revenue to each period. This allowed for an analytical test of 100 percent of the population, reducing the overall risk of the adjustment and, in turn, the number of contracts requiring detail testing and vouching. Appropriate design and application of audit data analytics provides a tailored approach that allows more efficient and effective audit procedures to be replicated each year.
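The proportionate allocation described above can be sketched as follows, with hypothetical loads and a simple whole-day count assumed for illustration:

```python
from datetime import date

MEASUREMENT_DATE = date(2019, 12, 31)

# Hypothetical loads still in transit at the measurement date.
loads = [
    {"id": "L-1", "pickup": date(2019, 12, 30), "delivered": date(2020, 1, 3), "revenue": 4000.0},
    {"id": "L-2", "pickup": date(2019, 12, 29), "delivered": date(2020, 1, 2), "revenue": 2000.0},
]

def revenue_earned_by(load, cutoff):
    """Allocate revenue proportionately over transit days elapsed through the cutoff."""
    total_days = (load["delivered"] - load["pickup"]).days
    days_elapsed = (cutoff - load["pickup"]).days
    return load["revenue"] * days_elapsed / total_days

# Revenue earned before year-end on loads in transit at the measurement date.
accrual = sum(revenue_earned_by(ld, MEASUREMENT_DATE) for ld in loads)
```

Running this over every in-transit load is what makes the 100 percent analytical test possible, in contrast to detail testing a sample of contracts.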
The interpretation of results is another hurdle CPAs face, as we often share results with individuals who lack a financial background. Portraying current positions or conveying the importance of certain metrics can be challenging. Data visualization aims to provide deeper insights into data sets while delivering easy-to-interpret results and creating efficient review processes. Dashboards take these insights and display them in one place. Data visualization dashboards are also useful for planning procedures (preliminary analytics), as well as for revenue analytics, key performance indicator tracking, or any other analysis of financial metrics of interest.
Develop an action plan
Discussing ideas for using automated tools and techniques and actually putting those tools to work on real-life engagements are two very different endeavors. How do you bridge the gap from concepts dreamed up by CPAs to actionable solutions performed by CPUs?
Setting goals and objectives is critical to getting across the hypothetical bridge. What problem are you attempting to solve and how will the use of data analytics and process automation tools help you? Answering that question first will kick-start implementation, as there is now a target solution. Developing a thought-out plan, including desired outcomes, benefits, and intended process, is necessary. Injecting change into current processes will require buy-in and collaboration from all parties involved, including the support and approval of key decision makers.
A step-by-step action plan will aid in turning the vision into reality. Throughout implementation, you will be able to evaluate progress to ensure it continues to align with your company’s strategic plan. Additionally, the plan will help in explaining the proposed changes, and why they are beneficial, to those who will be performing the day-to-day tasks.
Developing an action plan will force you to define what success looks like with this endeavor. Reduction of hours spent on certain tasks to increase capacity is one possible definition. Expanding service offerings and technical knowledge of staff could be another. Being transparent with intentions, goals, and the definition of success will lead to an easier evaluation in the end. It will also force realistic discussions surrounding employee resources, available skill sets, and the potential need to acquire or internally build out the appropriate infrastructure.
Infrastructure relates to employees, but also to the data processing tools that will be used. Whether the tools are directly listed on the ADA mapping document or among the many other automation or data analytic tools in the market, it is important to invest in software that fits your specific needs and helps to achieve your goals. Select a software solution that fits into your plan, not the other way around. Committing to the wrong tool early can stunt creativity and growth potential, and development of your processes could be restricted based upon software limitations.
Successful implementation of these tools (or any type of change, for that matter) requires agility. Your initial results or experiences may alter your overall perspective. Continuous monitoring of goals, evaluation of process results, and adherence to the action plan must be foundational pillars of integration. Innovation doesn’t happen overnight, and taking the initial steps to impart change into preexisting processes is a success in itself.
1 “Data Never Sleeps 7.0,” Domo Inc., 2019. www.domo.com/learn/data-never-sleeps-7
2 Daniel Hood, “The Year Ahead for Accounting: 2019 in Numbers,” Accounting Today (2019).
3 Proposed Statement on Auditing Standards – Audit Evidence (Exposure Draft), AICPA Auditing Standards Board (June 20, 2019).
4 Audit Data Analytics Guide, AICPA (2019).
Christopher T. Kosty, CPA, is ADAPT senior analyst for Schneider Downs & Co. Inc. in Pittsburgh. He can be reached at email@example.com.
Matthew R. Kraemer, CPA, is ADAPT manager for Schneider Downs & Co. Inc. He can be reached at firstname.lastname@example.org.