Auditing Section Research Summaries Space

A Database of Auditing Research - Building Bridges with Practice

This is a public Custom Hive.

Posts

  • Jennifer M Mueller-Phillips
    Finding Needles in a Haystack: Using Data Analytics to...
    research summary posted June 26, 2017 by Jennifer M Mueller-Phillips, tagged 06.02 Fraud Risk Models, 08.09 Impact of Technology on Audit Procedures 
    Title:
    Finding Needles in a Haystack: Using Data Analytics to Improve Fraud Prediction
    Practical Implications:

    Data analytics can be used to create fraud prediction models that help auditors improve audit planning decisions. It can also be used to help regulators identify firms for potential fraud investigation. In particular, the SEC is investing resources to develop better fraud risk models and the results of this study could be useful. 

    Citation:

    Perols, J. L., R. M. Bowen, C. Zimmermann, and B. Samba. 2017. Finding Needles in a Haystack: Using Data Analytics to Improve Fraud Prediction. The Accounting Review 92 (2): 221.

    Keywords:
    fraud; financial statement fraud; data analytics; predictive analytics; data rarity; data imbalance
    Purpose of the Study:

    Financial statement fraud causes organizations to lose an estimated 1.6% of annual revenue. This study examines three different methods that use data analytics to predict fraud. The methods are as follows:

    • The first method, Multi-Subset Observation Undersampling (OU), addresses the imbalance between the low number of fraud observations relative to the number of non-fraud observations by creating multiple subsets of the original dataset that each contain all fraud observations and different random subsamples of non-fraud observations (see the code sketch after this list).
    • The second method, Multi-Subset Variable Undersampling (VU), addresses the imbalance between the low number of fraud observations relative to the number of explanatory variables identified in the fraud prediction literature by creating multiple subsets of randomly selected explanatory variables.
    • The third method, VU partitioned by type of fraud (PVU), is a variation of the second method that addresses issues associated with treating all fraud cases as homogeneous events.
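
    To make the multi-subset idea concrete, here is a minimal sketch of OU, assuming a toy simulated dataset and scikit-learn's logistic regression as the base classifier; the data, subset count, and model choice are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of Multi-Subset Observation Undersampling (OU).
# Toy data and parameter choices; not the authors' implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated rare-fraud setting: 50 fraud vs. 5,000 non-fraud firms.
n_fraud, n_clean, n_vars = 50, 5000, 20
X = rng.normal(size=(n_fraud + n_clean, n_vars))
y = np.r_[np.ones(n_fraud), np.zeros(n_clean)]  # 1 = fraud
X[y == 1] += 0.5  # give fraud cases a weak signal

fraud_idx = np.where(y == 1)[0]
clean_idx = np.where(y == 0)[0]

def ou_ensemble(n_subsets=12):
    """One model per subset: each subset keeps ALL fraud observations
    plus a fresh random subsample of non-fraud observations."""
    models = []
    for _ in range(n_subsets):
        sampled = rng.choice(clean_idx, size=len(fraud_idx), replace=False)
        idx = np.r_[fraud_idx, sampled]
        models.append(LogisticRegression(max_iter=1000).fit(X[idx], y[idx]))
    return models

# Fraud score = average predicted probability across the ensemble.
scores = np.mean([m.predict_proba(X)[:, 1] for m in ou_ensemble()], axis=0)
print("mean score, fraud vs. non-fraud:",
      scores[y == 1].mean().round(3), scores[y == 0].mean().round(3))
```

    Averaging scores across subsets lets the ensemble use every non-fraud observation while keeping each individual training set balanced.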
    Design/Method/Approach:

    The sample contains data from 51 fraud firms. The authors identified fraud firms from SEC investigations reported in AAERs between 1998 and 2005. The objectives of the experiments were to determine how best to implement OU and VU and then to evaluate their performance against benchmarks.

    Findings:

    The authors find the following:

    • When Multi-Subset Observation Undersampling (OU) is used with 12 subsamples, it improves fraud prediction by lowering the expected cost of misclassification by more than 10% relative to the best performing benchmark (the sketch after this list shows how such a cost is computed).
    • Multi-Subset Variable Undersampling (VU) was found to improve fraud prediction in select situations; however, it does not do so reliably.
    • Multi-Subset Variable Undersampling partitioned by type of fraud (PVU) improved fraud prediction and reduced the expected cost of misclassification by 9.6% relative to the best performing VU benchmark.
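
    For readers unfamiliar with the metric, the snippet below shows a standard way an expected cost of misclassification is computed, as a cost-weighted combination of false negative and false positive rates; the 30:1 cost ratio, fraud prior, and error rates are made-up numbers for illustration, not figures from the study.

```python
# Hedged sketch of an expected cost of misclassification (ECM) metric.
# The cost ratio, fraud prior, and error rates below are illustrative
# assumptions, not values from the study.
def ecm(p_fraud, fnr, fpr, c_fn=30.0, c_fp=1.0):
    """Expected cost per firm: missed frauds cost c_fn, false alarms c_fp."""
    return p_fraud * fnr * c_fn + (1.0 - p_fraud) * fpr * c_fp

benchmark = ecm(p_fraud=0.01, fnr=0.40, fpr=0.10)
ou_model = ecm(p_fraud=0.01, fnr=0.30, fpr=0.10)
print(f"relative cost reduction: {(benchmark - ou_model) / benchmark:.1%}")
```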
    Category:
    Auditing Procedures - Nature - Timing and Extent, Risk & Risk Management - Including Fraud Risk
    Sub-category:
    Fraud Risk Models, Impact of Technology on Audit Procedures
    Home:

    http://commons.aaahq.org/groups/e5075f0eec/summary

  • Jennifer M Mueller-Phillips
    Infer, Predict, and Assure: Accounting Opportunities in Data...
    research summary posted September 21, 2015 by Jennifer M Mueller-Phillips, tagged 08.0 Auditing Procedures – Nature, Timing and Extent, 08.09 Impact of Technology on Audit Procedures 
    Title:
    Infer, Predict, and Assure: Accounting Opportunities in Data Analytics.
    Practical Implications:

    This article is important to practitioners as well as academics because both groups will be using data analytics in accounting and auditing tasks and will need to specify the system design characteristics required to accomplish these tasks effectively. The authors identify several research questions for further study.

    Citation:

    Schneider, G. P., J. Dai, D. J. Janvrin, K. Ajayi, and R. L. Raschke. 2015. Infer, Predict, and Assure: Accounting Opportunities in Data Analytics. Accounting Horizons 29 (3): 719-742.

    Keywords:
    AIS meta-theory, data analytics, task processes
    Purpose of the Study:

    The objective of this paper is to examine how data analytics will impact the accounting and auditing environment, identify emerging management and regulatory challenges, and outline new research opportunities. To incorporate and process both structured and unstructured data to support decisions, accountants are working with a new set of sophisticated tools known as data analytics. Data analytics is the process of applying various analytic techniques, such as statistical and quantitative analysis and explanatory and predictive models, to structured and unstructured data in order to provide useful information to decision-makers. Data analytics involves complex procedures that extract useful knowledge from large data repositories. Compared with conventional approaches, data analytics offers advantages in terms of cost-effectiveness, scalability, and the capability to identify new patterns in real time.

    Several challenges and risks may arise with data analytics. First, how can voluminous data stored in heterogeneous and differently organized data sources be converted into a structured, and hence interpretable, format? In doing so, uncorrelated data needs to be filtered out; the challenge is identifying which data to filter out. Further, how can structured data repositories be managed, processed, and transformed in order to derive the information needed for decision-making purposes? Finally, data analytics applications often must be highly scalable.

    Design/Method/Approach:

    This article is a commentary. 

    Findings:

    The authors expand upon the challenges and risks by adopting the organizing principles of the meta-theory of AIS and applying them to data analytics. The first principle states that data analytics research should be task-focused; their analysis concentrates on three tasks to which accountants often apply data analytics: infer, predict, and assure. The second organizing principle notes that task requirements are the start of the process that establishes the set of system design characteristics needed. They note that the lack of accepted models of data analytics and related perceptions is a significant challenge that should be considered. The third principle suggests that the impact of data analytics on task performance should be examined within the context of cognitive, technological, and organizational contingency factors; the authors identify several research questions related to each of these factors. Finally, the fourth principle states that the outcome of data analytics is task performance. The authors discuss how evaluating the infer, predict, and assure tasks completed with data analytics may occur at either the individual or organizational level. In addition, the output of data analytics often contains private and/or confidential information, and more research is needed to examine how organizations can meet their responsibilities to maintain privacy and confidentiality.

    Category:
    Auditing Procedures - Nature - Timing and Extent
    Sub-category:
    Impact of Technology on Audit Procedures
  • Jennifer M Mueller-Phillips
    Big Data in Accounting: An Overview.
    research summary posted September 21, 2015 by Jennifer M Mueller-Phillips, tagged 01.0 Standard Setting, 01.01 Changes in Reporting Formats, 01.02 Changes in Audit Standards, 08.0 Auditing Procedures – Nature, Timing and Extent, 08.09 Impact of Technology on Audit Procedures, 09.0 Auditor Judgment 
    Title:
    Big Data in Accounting: An Overview.
    Practical Implications:

    The availability of Big Data will precipitate substantive changes in accounting education, research, and practice. In education, particularly in accounting and auditing, the use of Big Data will increase the statistical and IT content in curricula, probably breaking the current set of limitations represented in the CPA exam. Research in the more traditional fields of accounting, such as capital markets research, will benefit from dimensional increases in data availability, conditional on improvements in researchers' skill sets in areas such as modeling, statistics, and text mining. Practice, in particular internal audit departments, will be the leading facilitator of accounting Big Data usage while attempting to keep abreast of developments in corporate data utilization in fields like marketing, supply chain, and customer service.

    Citation:

    Vasarhelyi, M. A., A. Kogan, and B. M. Tuttle. 2015. Big Data in Accounting: An Overview. Accounting Horizons 29 (2): 381-396.

    Keywords:
    analytics, audit judgment, enterprise data ecosystem, reporting, standards, storage
    Purpose of the Study:

    The term Big Data is fairly new yet seems to be applied in almost every area of human activity at the moment. It is not rigorously defined and is usually used under the assumption that readers understand it at an intuitive level. The reason for this popularity is the exponentially growing amount of information made available by developments in computing and telecommunications technology, particularly the Internet and environmental sensing. This paper sheds light on the meaning of Big Data in the accounting and auditing domains.

    Design/Method/Approach:

    This article is a commentary.

    Findings:
    • The definition of Big Data is conditional on the environment being used.
    • Processing needs are nonlinear with the size of data. Even small datasets may be computationally difficult if models are complex.
    • There is a progressive extension of the feasible dataset. Inclusion of sources is mainly an economic and legal issue and not one of feasibility.
    • Newly included data structures contain a wide set of not previously determined/used parameters, which by themselves may be informational.
    • Extended, nontraditional data sources may substantively change the domains of accounting and auditing.
    • Linkages of traditional extended data, as found in ERPs, to new sources of data may provide very strong confirmatory evidence for economic activity.
    • Accounting, auditing, and management extensions into Big Data usage overlap and present powerful opportunities over the next decade, but they also require a re-conceptualization of these functions in an age of computer intelligence and automation.
    Category:
    Auditing Procedures - Nature - Timing and Extent, Auditor Judgment, Standard Setting
    Sub-category:
    Changes in Audit Standards, Changes in Reporting Formats, Impact of Technology on Audit Procedures
  • Jennifer M Mueller-Phillips
    Toward Effective Big Data Analysis in Continuous Auditing.
    research summary posted September 21, 2015 by Jennifer M Mueller-Phillips, tagged 08.0 Auditing Procedures – Nature, Timing and Extent, 08.09 Impact of Technology on Audit Procedures 
    Title:
    Toward Effective Big Data Analysis in Continuous Auditing.
    Practical Implications:

    The Big Data qualities of Volume, Velocity, Variety, and Veracity contribute to the creation of the following Big Data Gaps: Data Consistency, Data Integrity, Data Identification, Data Aggregation, and Data Confidentiality. These Big Data Gaps create challenges for current CA systems. The paper outlines possible solutions to these gaps along with needed research topics with the aim of increasing the applicability of continuous auditing systems to Big Data. Big Data is a business phenomenon that is here to stay, and CA systems need to adapt to its challenges.

    Citation:

    Zhang, J., X. Yang, and D. Appelbaum. 2015. Toward Effective Big Data Analysis in Continuous Auditing. Accounting Horizons 29 (2): 469-476.

    Keywords:
    Big Data, continuous auditing, gap analysis
    Purpose of the Study:

    Big Data originates from traditional transactions systems, as well as new sources such as emails, phone calls, Internet activities, social media, news media, sensor recordings and videos, and RFID tags. Since much of this Big Data informs and affects corporate decisions that are important to both internal and external corporate stakeholders, auditors will need to expand their current scope of data analysis.

    Certain qualities, known as the four Vs, define the term Big Data: namely, massive Volume or size of the database, high Velocity of data added on a continuous basis, large Variety of types of data, and uncertain Veracity. Due to volume and velocity, the application of continuous auditing (CA) has become increasingly relevant for the automation and real-time analysis of Big Data. However, massive volume and high velocity also introduce gaps between the present state of audit analytics and the requirements of Big Data analytics in a continuous audit context. Moreover, variety and uncertain veracity present challenges beyond the capability of current CA methods. The purpose of this paper is to identify these gaps and challenges and to point out the need for updating the CA system to accommodate Big Data analysis.

    Design/Method/Approach:

    This article is a commentary.

    Findings:

    The authors identify and discuss potential remediation for the five Big Data Gaps:

    • Data Consistency: Big Data systems supporting key business processes usually consist of a patchwork of different systems in which data may be fully or partially replicated, informational content may overlap, and more derived data may be stored. This situation gives rise to a serious gap in data consistency.
    • Data Integrity: the volume and types of data are so expansive that it becomes more difficult to identify individual data, as well as data sets, that have been modified, deleted, hidden, or destroyed because of operating error, procedural error, illegal access, and/or network transmission failures. This difficulty in identifying integrity issues can create a domino effect that causes otherwise reliable data to lose their value for audit analysis purposes, thus increasing audit risk in a Big Data, continuous audit environment.
    • Data Identification: refers to records that link two or more separately recorded pieces of information about the same individual or entity.
    • Data Aggregation: necessary for the normal operation of continuous auditing using Big Data and to meaningfully summarize and simplify the Big Data that is most likely coming from different sources.
    • Data Confidentiality: certain data, or more often the associations among data points, are sensitive and cannot be released to others.

    The authors identify the nine CA Challenges:

    • Audit on data with different formats
    • Audit on asynchronous data
    • Audit on conflicting data
    • Audit on illegally tampered data
    • Audit on incomplete data
    • Audit on data with various identifiers
    • Audit on aggregated data
    • Search encrypted data
    • Audit on encrypted data
    Category:
    Auditing Procedures - Nature - Timing and Extent
    Sub-category:
    Impact of Technology on Audit Procedures
  • Jennifer M Mueller-Phillips
    Big data analytics in financial statement audits.
    research summary posted September 11, 2015 by Jennifer M Mueller-Phillips, tagged 08.0 Auditing Procedures – Nature, Timing and Extent, 08.09 Impact of Technology on Audit Procedures, 10.0 Engagement Management 
    Title:
    Big data analytics in financial statement audits.
    Practical Implications:

    This article provides a concise introduction to Big Data analytics by providing examples of Big Data success stories in non-audit fields and drawing auditing parallels. It then indicates several characteristics of Big Data which should be considered when implementing Big Data analytics, specifically as they relate to audit procedures.

    Citation:

    Cao, M., R. Chychyla, and T. Stewart. 2015. Big data analytics in financial statement audits. Accounting Horizons 29 (2): 423-429.

    Keywords:
    Big data, analytical methods, auditing
    Purpose of the Study:

    The authors provide examples of Big Data analytics in other fields and suggest analogous auditing applications. They then briefly discuss characteristics of Big Data analytics that are specifically of relevance for the audit setting.

    Design/Method/Approach:

    This study uses examples of Big Data in other industries to provide guidance for auditors on implementing Big Data audit analytics. There is no original analysis or unique data.

    Findings:

    The authors outline several examples of the implementation of Big Data analytics in other fields and draw parallels to the audit world.

    • Using Google’s “Profile of Mood States” based on millions of tweets to predict shifts in the Dow Jones Industrial Average. The audit parallel: using similar tools to predict bankruptcy or assess the overall financial health of a firm to identify engagement/litigation risk.
    • Walmart uses sales transaction data to predict which items (surprisingly, Strawberry Pop-Tarts) have increased sales in response to dangerous weather patterns. The audit parallel: using sales trend data to identify problematic segments in scoping.
    • Ayata’s Prescriptive Analytics uses data from oil and gas drilling sites, such as images, video, sound, text, and numbers to predict optimal drilling sites. The audit parallel: Using new types of data for audit evidence to confirm existence of events and validate reporting elements.
    • The Los Angeles police department uses data from crime scenes to predict the most likely timing and location of crimes in order to deploy officers. The audit parallel: identifying fraud risks and focusing audit effort toward fraud detection.

    The authors then identify characteristics of Big Data that need to be considered when implementing analytics. They note that Big Data analytics are fundamentally different from procedures based on sampling since all data can be used. They note that Big Data helps determine that things are associated with one another, but not necessarily that one thing causes another. Lastly, they note that a key benefit to Big Data is that analytics can be continuously updated.

    Category:
    Auditing Procedures - Nature - Timing and Extent, Engagement Management
    Sub-category:
    Impact of Technology on Audit Procedures
  • Jennifer M Mueller-Phillips
    Behavioral implications of big data’s impact on audit j...
    research summary last edited September 11, 2015 by Jennifer M Mueller-Phillips, tagged 08.0 Auditing Procedures – Nature, Timing and Extent, 08.09 Impact of Technology on Audit Procedures, 09.0 Auditor Judgment 
    Title:
    Behavioral implications of big data’s impact on audit judgment and decision making and future research directions.
    Practical Implications:

    The authors outline potential pitfalls in the use of big data noted in the auditing and psychology literatures. Specifically, they focus on areas in which auditor decision making may be negatively influenced. They note that big data may lead to inefficient and/or incorrect decisions resulting from having too much information, not being able to determine what is relevant to the decision, not finding correct patterns in the data (or finding incorrect ones), and having data that may be too ambiguous to be used effectively. They then outline potential solutions to these problems, such as providing decision aids, fitting the system and task to the auditor's experience level, and providing contextual (“big picture”) training.

    Citation:

    Brown-Liburd, H., H. Issa, and D. Lombardi. 2015. Behavioral implications of big data’s impact on audit judgment and decision making and future research directions. Accounting Horizons 29(2): 451-468.

    Keywords:
    Big data, audit judgment, data analytics, information processing weaknesses
    Purpose of the Study:

    Big data (high-volume, high-velocity, and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making) has been the new trendy topic in the future of auditing. This paper outlines potential implications for the actions of the auditor resulting from the use of big data, specifically drawing from the literature on psychology and auditing.

    Design/Method/Approach:

    This paper summarizes the current literature in psychology and auditing specifically to bring to light potential issues resulting from incorporation of big data in auditing. No data is used and no analyses are performed.

    Findings:

    The authors find four primary areas where research in auditing and psychology indicate potential negative effects of big data on audit performance, specifically in auditor judgment and decision making:

    • Auditors may make inefficient decisions, struggle to differentiate relevant information, struggle to identify relationships between detail and overall trends, and disregard large portions of information (information overload).
    • Auditors may make poorer decisions due to the excess of irrelevant information innate in big data. They may be unable to filter relevant information from the noise, thus being less efficient in analyzing data and potentially using irrelevant information in decision making (information relevance).
    • The sheer magnitude of data to analyze may force auditors to become worse at recognizing patterns, applying knowledge to the audit task, weighting evidence, or extrapolating patterns into longer time series (pattern recognition).
    • Auditors, if they have a low tolerance for ambiguity, may neglect information once a simple solution or response is identified, potentially ignoring relevant but more complex information (ambiguity).

    The authors then provide examples of solutions to these problems, such as (1) providing decision models, (2) having inexperienced auditors use expert systems that require minimal auditor interpretation, (3) providing auditors with more contextual experience and training so they can interpret patterns in big data, and (4) leveraging predictive models that can indicate areas of increased risk.

    Category:
    Auditing Procedures - Nature - Timing and Extent, Auditor Judgment
    Sub-category:
    Impact of Technology on Audit Procedures
  • Jennifer M Mueller-Phillips
    Big data as complementary audit evidence.
    research summary last edited September 11, 2015 by Jennifer M Mueller-Phillips, tagged 08.0 Auditing Procedures – Nature, Timing and Extent, 08.09 Impact of Technology on Audit Procedures, 09.0 Auditor Judgment, 09.03 Adequacy of Evidence 
    Title:
    Big data as complementary audit evidence.
    Practical Implications:

    Incorporating Big Data into an audit poses several challenges. This article establishes how Big Data analytics satisfy the requirements of audit evidence, namely that it be sufficient, reliable, and relevant. The authors raise practical challenges (such as transferring information, protecting privacy, and integrating with traditional audit evidence) and provide suggestions for addressing them when incorporating Big Data into audit evidence. They also suggest that Big Data can complement traditional audit evidence at every level of audit evidence: financial statement, individual account, and audit objective.

    Citation:

    Yoon, K., L. Hoogduin, and L. Zhang. 2015. Big data as complementary audit evidence. Accounting Horizons 29 (2): 431-438.

    Keywords:
    Big data, audit evidence
    Purpose of the Study:

    This paper frames Big Data in the context of audit evidence, specifically looking at the requirements for something to be considered audit evidence, to provide an argument for the usefulness of Big Data to auditors. The authors address the sufficiency, reliability, and relevance of Big Data analytics; they then outline potential challenges to using Big Data for adequate audit evidence.

    Design/Method/Approach:

    The authors summarize existing literature on audit evidence as it applies to Big Data. They perform no original analyses, but rather discuss the characteristics of Big Data analytics as they relate to regulations and research findings.

    Findings:

    The authors address:

    • Sufficiency: The authors suggest that, when used appropriately, Big Data analytics can meet the sufficiency requirements for audit evidence. They provide the example of using an employee’s emails to identify motivation or rationalization for fraud, demonstrating how Big Data can supplement traditional audit evidence where traditional methods may be deficient in sufficiently documenting audit conclusions.
    • Reliability: Big Data, being typically from a third party and massive in nature, is argued to be generally reliable as audit evidence. The authors note that Big Data can validate things such as shipping terms to independently verify cutoff.
    • Relevance: The relevance of Big Data is primarily driven by the timeliness of its availability. Traditional audit evidence is often gathered after the fact; Big Data-based auditing, however, can analyze current trends to provide timely information. The authors provide several examples, such as using management’s discussion of forecasts: research has linked overly optimistic press releases to fraud, so applying Big Data techniques to earnings forecasts may assist in assigning fraud risk.
    • Integration with Traditional Audit Evidence: The authors acknowledge that Big Data may not always bridge easily into traditional audit evidence; however, they discuss weighting evidence, as one would traditional audit evidence, so that more weight is given to the more sufficient, reliable, and relevant evidence.
    • Information Transfer: Access to data provides benefits that may be leveraged through economies of scale; however, clients may restrict access to proprietary data. The authors suggest contracting specifically for the use of internal data.
    • Information Privacy: A common fear in releasing information is that it may be used for a secondary purpose. The authors acknowledge this and suggest that auditors cooperate with information providers and ensure that information is anonymized.
    Category:
    Auditing Procedures - Nature - Timing and Extent, Auditor Judgment
    Sub-category:
    Adequacy of Evidence, Impact of Technology on Audit Procedures
  • Jennifer M Mueller-Phillips
    Examining the Potential Benefits of Internal Control...
    research summary posted March 9, 2015 by Jennifer M Mueller-Phillips, tagged 07.0 Internal Control, 08.0 Auditing Procedures – Nature, Timing and Extent, 08.09 Impact of Technology on Audit Procedures 
    Title:
    Examining the Potential Benefits of Internal Control Monitoring Technology
    Practical Implications:

    This study makes three primary contributions to the accounting and auditing communities of researchers, practitioners, and regulators. First, it is the first study to document economically significant benefits consistent with COSO’s assertions that formal ICM activities enhance the strength of internal control systems and the efficiency of external examinations of such internal control systems. Second, this study contributes to the literature in accounting information systems (AIS) by documenting specific benefits associated with strategically focused information technology. Finally, it enhances the IT-related auditing literature.

    However, further research is needed to examine the link between other types of ICM technology and specific outcomes. Similarly, it remains an open question how technology impacts other areas of internal control monitoring. COSO asserts that monitoring not only aids the financial reporting process but also, ultimately, the organization’s overall system of governance, including operational decision-making. Though this study documents benefit-oriented assurance outcomes, the research area remains fruitful with respect to the impact of ICM technology on other audit quality measures.

    For more information on this study, please contact Adi Masli.

    Citation:

    Masli, A., G. F. Peters, V. J. Richardson, and J. M. Sanchez. 2010. Examining the Potential Benefits of Internal Control Monitoring Technology. The Accounting Review 85 (3): 1001-1034. 

    Keywords:
    internal control monitoring; material weakness; audit fees; audit delays
    Purpose of the Study:

    The Committee of Sponsoring Organizations of the Treadway Commission (COSO) asserts that improved ICM practices should enhance the efficiency, effectiveness, and assurance of internal control processes. In January 2009, COSO issued guidance on ICM and observed that firms often struggle with realizing the benefits of ICM-related activities. The purpose of the current study is to examine the potential benefits that firms can realize from implementing ICM technology specifically aimed at monitoring and supporting the effectiveness of their internal control systems.

    Design/Method/Approach:

    The authors develop and test hypotheses for three explicit potential benefits associated with internal and external assurance outcomes: (1) more effective internal control systems, (2) enhanced audit efficiency, and (3) timely audit reporting. They identify 139 announcements of ICM technology purchases across the time period 2003–2006. The control group consists of all available observations listed in Audit Analytics SOX 404 Internal Controls database during the same periods.

    Findings:

    Consistent with the hypotheses, the study documents positive associations between ICM technology initiatives and subsequently stronger internal controls, enhanced audit efficiency, and timely audit reports. Collectively, the main results suggest that ICM technology yields important benefits in both internal and external assurance outcomes. In some of this research’s tests, the findings suggest transformative-oriented ICM technology initiatives yield greater assurance benefits compared to compliance-oriented ICM initiatives.

    Category:
    Auditing Procedures - Nature - Timing and Extent, Internal Control
    Sub-category:
    Impact of Technology on Audit Procedures
  • Jennifer M Mueller-Phillips
    A Field Study on the Use of Process Mining of Event Logs as...
    research summary posted February 15, 2015 by Jennifer M Mueller-Phillips, tagged 08.0 Auditing Procedures – Nature, Timing and Extent, 08.09 Impact of Technology on Audit Procedures 
    Title:
    A Field Study on the Use of Process Mining of Event Logs as an Analytical Procedure in Auditing
    Practical Implications:

    This paper is the first to demonstrate the value that process mining of event logs can add in auditing. Using real data drawn from the purchasing process of a global bank, we show that process mining can detect information relevant to internal auditors that was missed when those same auditors examined the same data using traditional analytical procedures. These results can be attributed to two distinct advantages of process mining over the standard audit procedures used by the internal auditors:

    1. The richness of the event log, which contains input data and meta-data as well as a comprehensive set of attributes, all systematically arranged by time and originator.
    2. The ability to analyze the entire population instead of being forced to use only a sample.

    The creation of event logs is a complex procedure that may require the use of consultants, but ERP vendors are likely to automate this process further as process mining becomes a vital tool in operational management. Several large audit firms in Europe are beginning to offer process mining as a consulting tool and are experimenting with it in external audit engagements.

    For more information on this study, please contact Michael Alles.

    Citation:

    Jans, M., M. Alles, and M. Vasarhelyi. 2014. A Field Study on the Use of Process Mining of Event Logs as an Analytical Procedure in Auditing. The Accounting Review 89 (5): 1751-1773.

    Keywords:
    process mining, analytical procedures, auditing, event logs.
    Purpose of the Study:

    In this paper, we demonstrate, using procurement data from a leading global bank, the value added in auditing by a new type of analytical procedure: process mining of event logs. Process mining is the systematic analysis of the data automatically recorded by modern information technology systems, such as the Enterprise Resource Planning (ERP) systems that form the IT infrastructure of most large and medium-sized businesses today.

    Design/Method/Approach:

    The field study site is a leading European bank that ranks among the top 25 in the world by asset size. It is also subject to the provisions of the Sarbanes-Oxley Act because of its operations in the United States. We focus on the bank’s procurement process because it is a typical, standardized business process in most businesses around the world and hence makes the field study more generalizable. Moreover, procurement represents a large expense item, totaling some 1.4 billion Euros in the period covered by the field study. The transactions in the field study consist of all invoices paid during the month of January 2007, which were then traced back to their accompanying purchase orders. This population data, consisting of some 31,817 payments, were analyzed using a variety of data mining tools developed by process mining researchers to identify audit-relevant information missed by the bank’s own internal auditors.

    Findings:

    The bank’s internal auditors did not find any significant ICFR weaknesses in the procurement process and judged that its SAP™ controls were appropriately set to ensure a strong control environment. By contrast, the process mining analysis identified numerous instances of audit-relevant information that warranted follow-up manual investigation by the internal auditors under SAS 56 (two of these checks are sketched in code after the list):

    1. Purchase control procedures require Sign and Release activities for each purchase order, but the process mining analysis detected three POs that lacked these activities.
    2. SOD control procedures require that Goods Receipt and Release not be undertaken by the same employee, but the process mining analysis detected 175 violations of this control.
    3. The process mining analysis detected 265 payments that lacked a matching invoice.
    4. The process mining analysis detected three POs that lacked a Goods Receipt entry in the system, although the Goods Receipt indicator was flagged.
    5. Purchase control procedures require a Sign activity in all cases except under certain exceptional circumstances, but the process mining analysis detected 742 occurrences where a Sign activity was lacking even though the conditions for the exception were not met.
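
    Two of these control checks reduce to simple queries over the event log. The sketch below runs the missing Sign/Release test and the Goods Receipt/Release segregation-of-duties test on a tiny made-up log; the column layout (case_id, activity, originator) and the pandas approach are assumptions for illustration, not the tooling the authors used.

```python
# Sketch of two event-log control checks on a made-up purchase-order log:
# (1) every PO needs Sign and Release; (2) Goods Receipt and Release must
# not be performed by the same employee (segregation of duties).
import pandas as pd

log = pd.DataFrame({
    "case_id":    ["PO1", "PO1", "PO1", "PO2", "PO2", "PO3"],
    "activity":   ["Sign", "Release", "Goods Receipt",
                   "Goods Receipt", "Release", "Release"],
    "originator": ["alice", "bob", "carol", "dave", "dave", "erin"],
})

# Check 1: flag POs whose recorded activities lack Sign and/or Release.
acts = log.groupby("case_id")["activity"].agg(set)
missing = acts[[not {"Sign", "Release"} <= a for a in acts]]
print("POs missing Sign and/or Release:", list(missing.index))

# Check 2 (SOD): flag POs where one person did both activities.
def sod_violation(events: pd.DataFrame) -> bool:
    gr = set(events.loc[events["activity"] == "Goods Receipt", "originator"])
    rel = set(events.loc[events["activity"] == "Release", "originator"])
    return bool(gr & rel)  # nonempty overlap means one person did both

violations = [po for po, ev in log.groupby("case_id") if sod_violation(ev)]
print("SOD violations:", violations)
```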
    Category:
    Auditing Procedures - Nature - Timing and Extent
    Sub-category:
    Impact of Technology on Audit Procedures
  • Jennifer M Mueller-Phillips
    Design and Evaluation of a Continuous Data Level Auditing...
    research summary posted February 15, 2015 by Jennifer M Mueller-Phillips, tagged 08.0 Auditing Procedures – Nature, Timing and Extent, 08.01 Substantive Analytical Review – Effectiveness, 08.08 Projecting Interim Testing Conclusions Year End, 08.09 Impact of Technology on Audit Procedures 
    Title:
    Design and Evaluation of a Continuous Data Level Auditing System
    Practical Implications:

    This paper is intended to prompt auditors to take advantage of easier access to population data in today’s digital business environment. By abandoning sampling, auditors can develop much more sophisticated models of behavior that can identify anomalies in ways that were not possible before. Auditors can also be more creative in how they treat data, be it by aggregating across organizational subunits or over larger and smaller time units. Most innovative of all, auditors and/or managers can continually update their expectation models by investigating errors and anomalies in real time and correcting them, so that the model is not based on flawed data. We find that such error correction greatly improves the accuracy of analytical procedures. Perhaps the most important finding, however, is that almost all of the expectation models we used gave similarly strong results, which implies that what really matters is the size of the data set. Once auditors move away from sampling, they will find that population data provides great statistical power when developing analytical procedures, reducing the reliance on finding just the right procedure.

    For more information on this study, please contact Alexander Kogan.

    Citation:

    Kogan, A., M. Alles, M. Vasarhelyi, and J. Wu. 2014. Design and Evaluation of a Continuous Data Level Auditing System. Auditing: A Journal of Practice and Theory 33 (4): 221-245.

    Keywords:
    continuous auditing (CA), analytical procedures (AP), population data, auditing practice.
    Purpose of the Study:

    The purpose of this paper is to demonstrate how audit practice may change when auditors have access to real-time population data, to show how to use real-world data to develop APs for CA, and to compare different analytical procedures in a CA context. In the paper we develop a framework for a continuous data level auditing system and use a large sample of procurement data from a major health care provider to simulate an implementation of this framework. The first layer of the framework monitors compliance with deterministic business process rules, and the second layer consists of analytical monitoring of business processes. A distinction is made between exceptions identified by the first layer and anomalies identified by the second. The unique capability of continuous auditing to investigate (and possibly remediate) the identified anomalies in ‘‘pseudo-real time’’ (e.g., on a daily basis) is simulated and evaluated.

    Design/Method/Approach:

    Our simulated implementation of the data-oriented CA system focuses on procurement-related BPs and utilizes data sets extracted from the data warehouse of a healthcare management business with many billions of dollars in assets and close to two hundred thousand employees. The data sets include all procurement cycle daily transactions from October 1, 2003 through June 30, 2004. The number of transaction records for each activity ranges from approximately 330,000 to 550,000. Since we have access to population data, the first step is to undertake tests of details to detect violations of key controls. Once that is done, we turn to determining whether there are anomalies that do not violate any established controls but may nonetheless be indicative of potential problems.

    Implementing the analytical procedure component of the CA system requires creating models of expected behavior, which we label “continuity equations” (CE), to enable anomaly detection. We use advanced statistical models to extract CEs from the data, and then, by seeding errors, we determine how effectively the CE model identifies anomalies. We also investigate the effect of conducting APs on data aggregated over time or geographically, as well as the implications of error correction. (A toy illustration of the CE idea appears below.)
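
    To illustrate the continuity-equation idea in miniature, the sketch below fits a simple expectation model linking one day's orders to the next day's receipts and flags days whose prediction errors are unusually large; the linear form, the 3-sigma threshold, and the simulated data are all assumptions for illustration, not the paper's actual CE specifications.

```python
# Toy continuity-equation (CE) sketch: an expectation model linking two
# procurement activities, with residual-based anomaly flagging. The
# linear CE, 3-sigma cutoff, and simulated data are illustrative
# assumptions, not the paper's actual model specifications.
import numpy as np

rng = np.random.default_rng(1)
days = 200
orders = rng.poisson(400, size=days).astype(float)

# Receipts roughly follow the prior day's orders, plus noise;
# seed three errors so there is something to detect.
receipts = np.r_[0.0, 0.95 * orders[:-1]] + rng.normal(0, 10, size=days)
receipts[[50, 120, 180]] += 150.0

# Fit the CE: receipts_t = a + b * orders_{t-1}.
x, y_next = orders[:-1], receipts[1:]
b, a = np.polyfit(x, y_next, 1)
resid = y_next - (a + b * x)

# Flag days whose residual exceeds three standard deviations.
flagged = np.where(np.abs(resid) > 3 * resid.std())[0] + 1
print("flagged days:", flagged)  # the three seeded days stand out
```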

    Findings:

    Our research shows that when auditors have access to population data, there can be significant changes in the role and sequence of audit procedures. Since data access is not a constraint, tests of detail can be carried out first on the complete population data to find exceptions to controls and for transaction verification. APs can then be used, again on the complete population data, to find anomalies. This paper shows that while there are differences in the predictive ability and detection performance of the various CE models, all models perform reasonably well, and no single model performs better on all aspects. Two important conclusions can be drawn from this. First, the choice of a particular model from among the candidate CE models is less important than the fact that all models yield fairly effective AP tests. Second, because all the CE models yield reasonably effective analytical procedures, when auditors have access to complete transaction data, the richness of that disaggregate data, combined with the reorganization of the auditing workflow to implement pseudo-real-time error correction, makes BP problem detection robust across a variety of expectation models. In other words, it is the nature of the data that serves as audit evidence that is the primary driver of audit effectiveness, with the selection of the specific AP a second-order concern; not because the audit benchmark is unimportant, but because auditing at the process level makes anomalies stand out much more obviously in the data.

    Category:
    Auditing Procedures - Nature - Timing and Extent, Auditor Judgment
    Sub-category:
    Evaluation of Errors - Statistical and Non-statistical, Impact of Technology on Audit Procedures, Substantive Analytical Review – Effectiveness
