JIS Senior Editors' Blog

Journal of Information Systems


Posts

  • Roger Debreceny
    Recent publication - Exception Prioritization in the...
    blog entry posted August 26, 2016 by Roger Debreceny 

    Exception Prioritization in the Continuous Auditing Environment: A Framework and Experimental Evaluation

A paper coming out in a future issue is Pei Li, David Y. Chan, and Alexander Kogan, "Exception Prioritization in the Continuous Auditing Environment: A Framework and Experimental Evaluation." Here is a blog post by the authors.

An important potential benefit of a continuous auditing system is the improvement of audit efficiency through the automation of audit procedures.  However, the inherent nature of a continuous auditing system may in fact diminish any economic benefits from automation.  Researchers have found that the large volume of exceptions generated by a continuous auditing system can be overwhelming for an internal audit department.  Exceptions are irregular or suspicious transactions, or internal control violations, identified by the continuous auditing system that need to be manually investigated by the auditor.  As a result, a large number of exceptions diminishes the economic efficiency gained through automation.

    In this study, we propose a framework that systematically prioritizes exceptions based on the likelihood of an exception being erroneous or fraudulent.  The framework is based on an initial set of rules that are generated by internal auditors to detect irregular transactions.  These rules are assigned a confidence level depending on their effectiveness in detecting errors or fraud.  The continuous auditing system identifies transactions that violate a single rule or multiple rules and labels those transactions as exceptions.  The suspicion score of each of these exceptions is generated using the Dempster-Shafer theory of belief functions.  Then, the auditors are guided to investigate those exceptions that have the highest suspicion scores.
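To make the scoring step concrete, here is a minimal sketch, in Python, of how the confidence levels of the rules a transaction violates might be combined into a suspicion score using Dempster-Shafer belief functions. It assumes each violated rule commits its confidence to "fraudulent or erroneous" and leaves the remainder on the whole frame; the confidences, function names, and mass assignment are illustrative assumptions, not the authors' exact model.

```python
from functools import reduce
from itertools import product

# Illustrative only: the frame of discernment is that a transaction is either
# Fraudulent/erroneous ("F") or Legitimate ("L").
F, L, THETA = frozenset("F"), frozenset("L"), frozenset("FL")

def rule_mass(confidence):
    """Simple support function for one violated rule: the rule's confidence is
    committed to "F"; the rest stays on the whole frame (ignorance)."""
    return {F: confidence, THETA: 1.0 - confidence}

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def suspicion_score(rule_confidences):
    """Belief that the transaction is fraudulent or erroneous, given the
    confidence levels of all rules the transaction violated."""
    masses = [rule_mass(c) for c in rule_confidences]
    return reduce(combine, masses).get(F, 0.0)

# A transaction violating two rules with confidences 0.6 and 0.7 ranks above
# one that violates only the 0.7 rule.
print(suspicion_score([0.6, 0.7]))  # 0.88
print(suspicion_score([0.7]))       # 0.70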

The framework incorporates an advanced feature that learns from identified errors and fraud after each iteration by employing two methods.  First, the confidence level of a rule that contributed to finding erroneous or fraudulent transactions is revised on the basis of the auditors' investigative results.  Second, a rule learner algorithm is implemented to add new rules to the original set developed by the auditors.  Although the confidence levels of the auditor-based rules have been refined, newly identified erroneous and fraudulent transactions might have new attributes that are not represented in the existing rules.  This method captures the attributes of errors and fraud to create new rules that will attempt to find similar instances subsequently.

The framework consists of six stages: 1) generation of exceptions using defined rules, 2) assignment of suspicion scores to exceptions using belief functions, 3) exception prioritization, 4) exception investigation, 5) rule confidence level update utilizing back propagation, and 6) rule addition utilizing a rule learner algorithm.
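As a rough picture of how these stages could fit together, the following hypothetical sketch wires them into one iteration. The Rule class, the capacity parameter, the proportional confidence nudge, and the investigate/learn_new_rules callbacks are placeholders standing in for the paper's back-propagation update and rule-learner algorithm, not the authors' implementation.

```python
from dataclasses import dataclass
from math import prod
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    violated_by: Callable[[Dict], bool]  # predicate over a transaction record
    confidence: float                    # effectiveness at flagging errors/fraud

def suspicion(confidences: List[float]) -> float:
    # Closed form of the belief-function combination above when every violated
    # rule commits its confidence to "fraudulent or erroneous".
    return 1.0 - prod(1.0 - c for c in confidences)

def run_iteration(transactions, rules, investigate, learn_new_rules, capacity=10):
    """One pass through the six stages. `investigate` stands in for the
    auditor's manual review and `learn_new_rules` for the rule-learner
    algorithm; both are assumed callbacks, not part of the paper."""
    # 1) Generate exceptions: transactions that violate at least one rule.
    exceptions = [(t, [r for r in rules if r.violated_by(t)]) for t in transactions]
    exceptions = [(t, hits) for t, hits in exceptions if hits]
    # 2-3) Score each exception and prioritize by descending suspicion.
    ranked = sorted(exceptions,
                    key=lambda th: suspicion([r.confidence for r in th[1]]),
                    reverse=True)
    # 4) Investigate the top-ranked exceptions up to the audit capacity.
    confirmed = []
    for t, hits in ranked[:capacity]:
        is_bad = investigate(t)
        if is_bad:
            confirmed.append(t)
        # 5) Revise the confidence of every rule that flagged this transaction
        #    (a simple proportional nudge, standing in for back propagation).
        for r in hits:
            r.confidence += 0.1 * ((1.0 if is_bad else 0.0) - r.confidence)
    # 6) Add machine-learned rules for attributes the current rules miss.
    rules.extend(learn_new_rules(confirmed))
    return rules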

We validated the proposed framework using a simulated experiment. The experiment used accounts payable transactional data from a technology company and simulated irregular transactions.  The results from the experiment provide evidence that the proposed framework can effectively prioritize erroneous or fraudulent transactions.  Furthermore, the results indicate that using back propagation to refine the confidence levels of rules and using a rule learner algorithm to generate additional rules helped improve the effectiveness of exception prioritization in subsequent iterations of the process.

  • Roger S Debreceny
    In a sea of Big Data, what becomes of accounting and...
    blog entry posted March 23, 2016 by Roger S Debreceny, tagged research 

Dr A. Faye Borthick at Georgia State University is the co-chair of JISC2016, the 2nd Journal of Information Systems Research Conference, to be held at the offices of Workday, Inc., Pleasanton, CA on October 13 & 14. The theme of the conference is An Accounting Information Systems Perspective on Data Analytics and Big Data.

    In this commentary, Dr Borthick sets out some of the issues that are relevant for the conference.

     

    Dr A. Faye Borthick


    In a sea of Big Data, what becomes of accounting and auditing?
    A. Faye Borthick

    Evolution of Big Data and Its Insinuation in Organizations

Big Data and the software for doing interesting things with the data have developed far enough that some trends have emerged. People are clever. Leave them alone with resources, and they will do interesting things with them, giving both intended and unintended consequences. This commentary highlights the landscape of Big Data, not the technical aspects per se but how organizations are starting to use data in different ways. While it is true that some of what this commentary offers does not, strictly speaking, require Big Data with respect to volume, diversity, and structure, the connotations that Big Data bestowed have prompted new ways to stage and use data. For example, “70% of firms now say that big data is of critical importance to their firms” (Malone 2016, A17).

    We invite you to reflect on burgeoning data and its emerging uses highlighted below as you consider innovative practice on aspects related to data analytics and Big Data.

    Accounting

    Managers and investors are tantalized by the prospects of using more data to make companies more profitable and to make organizations more responsive to their constituents (Dwoskin 2014). The growing number of fintech company startups making loans based only on digital data illustrates the phenomenon (Rudegeair 2015). Furthermore, ideas about how to take advantage of data can come from anywhere. This is the phenomenon that propels startup companies into billion dollar IPOs (initial public offerings of stock) in just a few years.

    With new insights into consumer/buyer behavior and product performance, companies and organizations have been compelled to accelerate changes to their business processes to stay current with their product and service offerings (Zetlin 2015). Instead of weeks, General Motors’ profitability analysis of the Chevrolet brand in Europe took only days (Monga 2014). Food companies have noticed declines in sales of the worst offenders of packaged foods associated with packing on the pounds (Esterl 2016) or foods containing GMOs (Brat 2015). How fast can packaged food makers adjust to such a shift? When Ben & Jerry’s set out to source only organic products for its ice cream, it discovered that a key ingredient, organic milk, was not available in sufficient quantity for its volume (Gasparro 2014). The company is caught between consumers whose tastes change rapidly and a supply chain that requires years to evolve.

Much as companies aspire to change their information systems to take advantage of the business insights that analysis of data can afford, they are hindered by existing systems that have accreted over decades. The bigger and older the company, typically the more systems a company has and the less they are integrated. For example, General Electric hired Amazon to help it “reduce internal applications to 5,000 from more than 9,000 and move them to Amazon and other cloud services,” in order to “allow GE to eliminate 30 of its 34 worldwide data centers and roll out new applications in as little as five minutes” (McMillan and Barr 2015, B5). Other companies such as Whole Foods Market Inc. and Wal-Mart Stores Inc. are “plowing into years-long efforts to merge disparate data sets, in the hope of extracting cost savings and insights about customers” (Norton 2015, B4).

Where do the data come from that companies and organizations want to analyze? Some are the familiar transaction data in accounting systems, usually highly structured, with careful editing before they are permitted to enter the accounting system. Some are data about the location or behavior of things, e.g., RFID (radio frequency ID) data, recorded in real time to indicate the existence of products or pallets of products in a specific location at a specific time. These data are usually highly structured. Then come the unstructured data of interactions, often from social media, whose existence has spawned whole new categories of analysis.

    Lest one think that these data capabilities are only available to large companies, consider the experience of some startups in which middle management has been subsumed in data. Instead of hiring people to get the data needed for decisions (the traditional middle management role), the startups make extensive data dashboards available to everyone in the company. Middle managers are not needed to gather information and make decisions because “every employee can have the tools to monitor progress toward any goal” (Mims 2015). The transparency and accountability afforded by all-employee access to dashboards means that leaders can find out how the business is performing directly without relying on middle management.

    Where do the data live? More and more in the cloud, of course, with a dashboard interface most likely designed by the cloud-services provider or a consultant. This approach arose first in startups because it allowed them to run lean, minimizing headcount. Big companies can move to dashboards, but the costs of taming non-integrated systems loom large, and corporate cultures will have to be transformed. Big companies with their non-integrated legacy systems realize that time is of the essence because startup companies commence operations with fully integrated systems run from the cloud (Ismail et al. 2014; Loten 2015).

What are the implications for employment of the people who now summarize, categorize, and report data? Merchandising staff are discovering that management wants to rely more on data analysis than instinct for product selection, much to the dismay of chief merchants, who, “once lionized for their knack for spotting trends, are finding their intuitions being displaced by algorithms” (Kapner 2015, B7). Floor stock traders have been replaced with software, and now financial analysts are being replaced with software (Popper 2016).

Where is the accountant in this uncharted sea? Because accountants understand traditional accounting data, they are uniquely poised to analyze them, including the related location and interaction data. As more accounting functions are automated in software, managers and investors are expecting higher levels of analysis of all the data. Investors are eager to spot business trends that portend changes in revenue. Thus, they are interested in feeds of transaction data, e.g., summaries by day of consumer purchases at publicly traded companies. The transactions are captured in the normal course of business but become useful for non-transaction purposes, a strategy known as cashing in on “exhaust,” i.e., “data collected while doing other business” (Hope 2015).

    Auditing

    What happens to auditing when data volumes grow, when data about interactions and observations become available in addition to the traditional transaction data? Transaction data are typically well structured, which makes them amenable to analysis in relational database systems. Interaction data, e.g., from social media or other sources, are typically unstructured. Data about observations may be structured, e.g., RFID data or logs containing process events, or unstructured depending on the context, e.g., comments appearing in logs of process events.

    Data proliferation challenges auditing because auditing has been formulated and conducted in a world in which data were limited and there were no good software tools for analyzing large volumes of data. But conditions are changing.

    Data limitations in auditing gave rise to sampling as a way to obtain evidence about account balances and flows. In the absence of data and computational software, auditors developed manual procedures based on sampling a small number of items (transactions usually) and checklists to ensure that lower level staff could execute audit procedures. But data limitations are falling away. When they know the whole population of data could be analyzed, people just laugh at the sampling mentality. Instead, they want data analytics applied to the whole population to make the data give up their secrets (Murphy and Tysiac 2015). This presents a problem to auditors in that they have decades invested in a sampling/checklist/procedural approach to auditing in a time when their constituents want data analytics applied to whole data populations. Eventually auditing standards premised on sampling will be revised to embrace better evidence (Titera 2013).

    As auditors have dived deeper into company data, the Public Company Accounting Oversight Board (PCAOB) has stepped up its scrutiny of auditors’ testing of system-generated data and reports as a means of prompting auditors to detect more of the latent internal control deficiencies. In essence, the PCAOB is demanding that auditors vouch for controls over “the accuracy and completeness of the system-generated data or reports” (Munter 2015). Thus, even as auditors are facing more data, they are being pressed to detect deficiencies in internal control over the data on which they rely for evidence.

    Accounting firms are taking advantage of growing data availability and increasing software capabilities to create dashboards populated with operating and other related data streams (PwC 2015). The purpose of the dashboards is twofold: to facilitate auditors thinking analytically about risks and their instantiation in data patterns and to enable drilling down through the data to look for underlying causes for anomalies or for business opportunities. This approach can be called an analytics mindset.

Auditors are shifting from a sampling mentality to a data analytics approach as a competitive necessity. Their investments to reorient audit methodologies are large: developing auditing based in analytics, retraining staff, and seeking and cultivating an analytics mindset in new staff.

    Education

    The ripple effect of Big Data on university level education for accountants comes through calls from employers for entry-level accountants and auditors with analytical skill sets. The business press has documented the shift from armies of people tracking and paying for orders to automation of the task (Monga 2015). “Since 2004, the median number of full-time employees in the finance department of big companies has declined 40% to about 71 people for every $1 billion of revenue” (Monga 2015).

If manual entries, which used to require armies of people, have been automated, what skills do entry-level accountants need? A typical response usually includes a variant of “analyze data and present findings coherently to colleagues” (Johnson 2015). The data analytics response has been written into the AACSB accounting accreditation standards in the form of Standard A7 on data analytics (AACSB 2013).

    References

AACSB. 2013. Eligibility Procedures and Accreditation Standards for Accounting Accreditation, Standard A7. Tampa, FL: Association to Advance Collegiate Schools of Business. Accessed March 30, 2014: http://www.aacsb.edu/accreditation/standards/2013-accounting/Learning%20and%20Teaching%20Standards/standard7.aspx.

    Brat, I. 2015. Food goes 'GMO free' with same ingredients. The Wall Street Journal, B1 (August 21).

    Dwoskin, E. 2014. Tons of data. Now put it to use. The Wall Street Journal, R6 (October 20).

    Esterl, M. 2016. As sales fizzle, pop makers bill more for a sip. The Wall Street Journal, B1 (January 28).

    Gasparro, A. 2014. How we eat: GMO fight ripples down food chain. The Wall Street Journal, A1 (August 8).

    Hope, B. 2015. Firm tracks cards, sells data. The Wall Street Journal, A1 (August 7).

    Ismail, S., M. S. Malone, and Y. Van Geest. 2014. Exponential Organizations: Why New Organizations Are Ten times Better, Faster, and Cheaper Than Yours (and What To Do About It). New York, NY: Diversionbooks.

Johnson, K. S. 2015. Outdated: The plain-vanilla accountant. The Wall Street Journal, B7 (May 19).

Kapner, S. 2015. Data pushes aside chief merchants. The Wall Street Journal, B7 (September 23).

    Loten, A. 2015. Cloud tools tackle new tasks. The Wall Street Journal, B6 (June 4).

    Malone, M. S. 2016. The Big-Data future has arrived. The Wall Street Journal, A17 (February 23).

    McMillan, R., and A. Barr. 2015. Google taps director for cloud push. The Wall Street Journal, B5 (December 24).

Mims, C. 2015. Data is now the new middle manager. The Wall Street Journal, B1, B2 (April 20).

    Monga, V. 2014. Big data chips away at cost. The Wall Street Journal, B6 (July 1).

———. 2015. The new bookkeeper is a robot. The Wall Street Journal, B1, B7 (May 5).

    Munter, H. A. 2015. Importance of audits of internal controls: Public Company Accounting Oversight Board (PCAOB). September 9. Available at http://pcaobus.org/News/Speech/Pages/Munter-Audits-Internal-Control-IAG-09092015.aspx. Accessed March 11, 2016.

    Murphy, M. L., and K. Tysiac. 2015. Data analytics helps auditors gain deep insight. Journal of Accountancy (April/May): 52-58.

    Norton, S. 2015. Big companies rein in data sprawl. The Wall Street Journal, B4 (October 22).

    Popper, N. 2016. The robots are coming for Wall Street. The New York Times, February 25. Available at http://www.nytimes.com/2016/02/28/magazine/the-robots-are-coming-for-wall-street.html?rref=collection%2Fsectioncollection%2Fmagazine&action=click&contentCollection=magazine&region=stream&module=stream_unit&version=latest&contentPlacement=9&pgtype=sectionfront. Accessed March 5, 2016.

    PwC. 2015. Data driven: What students need to succeed in a rapidly changing business world: PricewaterhouseCoopers LLP. Accessed May 31, 2015. http://www.pwc.com/us/en/faculty-resource/assets/PwC-Data-driven-paper-Feb2015.pdf.

    Rudegeair, P. 2015. Online firms seek to eat banks' lunch. The Wall Street Journal, C1 (June 29).

    Titera, W. R. 2013. Updating audit standard--enabling audit data analysis. Journal of Information Systems 27 (1): 325-331.

    Zetlin, M. 2015. Breaking free. CIO (October 1): 22-29.

     

  • Roger S Debreceny
    Forthcoming paper: Valuing Personal Data to Foster Privacy:...
    blog entry posted March 23, 2016 by Roger S Debreceny, tagged research 

An important new commentary in JIS from Juergen Sidgman and Malcolm Crompton is entitled "Valuing Personal Data to Foster Privacy: A Thought Experiment and Opportunities for Research." Juergen is Assistant Professor at the University of Wisconsin at Oshkosh. Malcolm is Managing Director of Information Integrity Solutions in Australia and was Australia's Privacy Commissioner from 1999 to 2004.

    Malcolm Crompton

    What if accounting standards changed to require corporations to determine and present the value of personal data on financial statements?

We argue that in this, the information age, markets would be significantly better informed, valuations of corporations would change radically, and board and management focus would be markedly different.

    In a time when data is increasingly becoming the foundation for business success, the race to collect, analyze, and use large amounts of personal data has left largely unattended the privacy considerations of individuals from whom this data is obtained. As a consequence, individuals are left alone to deal with increased risks of identity theft, potential embarrassment and stigma, and potential civil rights violations.

On the surface, organizations seem to lack real incentives to protect personal data. Individuals continuously provide personal data at no cost to governments and to businesses through interactions with social media, Internet searches, and possession of smart devices and other internet-connected equipment. In addition, anecdotal evidence shows that when lapses in corporate security lead to data breaches, the reputational damage encountered by the affected corporations has no long-term economic consequences.

In this commentary, we make the case that absent formal mechanisms enabling corporations to determine and present the value of personal data on financial statements, privacy considerations will continue to be neglected. We acknowledge the difficulty of the task ahead, which exposes difficult questions such as: how can we assess and reassess the evolving future economic benefits that collected data has the potential to create? In light of these difficulties and the paucity of research in this area, we present research opportunities that are relevant to different fields in accounting.

We also elaborate on how we expect data valuation to impact privacy through greater understanding of data, improved market efficiency, and more thorough oversight of management's handling of data assets. Finally, we argue that through data valuation the accounting community has a unique opportunity to limit not only improper data protection mechanisms that weaken privacy but also the growing market consensus about the diminishing usefulness of financial reports.

    We acknowledge that before any change such as this could be contemplated in practice, significant advances in accounting theory and practice will be essential. We also argue that the extent to which markets are currently ill-informed means that the time to start is now.

  • Roger S Debreceny
    Forthcoming paper: Repairing Organizational Legitimacy...
    blog entry posted March 23, 2016 by Roger S Debreceny, tagged research 

    A forthcoming paper in JIS is "Repairing Organizational Legitimacy Following Information Technology (IT) Material Weaknesses: Executive Turnover, IT Expertise, and IT System Upgrades" authored by Jacob Z. Haislip, Adi Masli, Vernon J. Richardson and Juan Manuel Sanchez.

    Jacob Z. Haislip

    Adi Masli

    Vernon J. Richardson

    Juan Manuel Sanchez

Information technology (IT) is of first-order importance to financial systems' ability to provide access to and security of accounting records, which ultimately affects financial reporting quality. COSO's 2013 framework on internal controls highlights the importance of IT since reliance on evolving and new technologies continues to grow. Specifically, COSO's 2013 framework lists selecting and developing controls over information technology as one of the pivotal internal control principles. Findings from extant research suggest that, relative to non-IT-related internal control material weaknesses, IT-related material weaknesses (or IT material weaknesses) have a more negative impact on the effectiveness of internal controls, result in less reliable financial information, and generate consequences that are more adverse. Hence, the existence of IT material weaknesses presents a threat to the legitimacy of an organization.

The objective of our research article is to examine how companies repair organizational legitimacy following information technology material weaknesses. We document that firms with IT material weaknesses, compared to firms with non-IT-related material weaknesses, are more likely to replace outgoing CEOs, CFOs, and directors with individuals possessing IT expertise. In addition, IT material weakness firms are more likely to upgrade their financial reporting IT system and make other IT initiative changes, such as hiring an executive dedicated to the oversight and management of IT or adding a technology committee to the board. Moreover, we find evidence suggesting that IT material weakness firms that engage in major changes in their IT governance (i.e., those that replace their CFO with executives with IT expertise and/or upgrade the financial reporting system) are more likely to remediate their internal control weaknesses.

Overall, firms recognize that IT control breakdowns represent a legitimate threat to the financial reporting environment, and they engage in various steps to restore organizational legitimacy.

  • Roger S Debreceny
Forthcoming paper: Blogs as Research and Teaching Resources...
    blog entry posted March 18, 2016 by Roger S Debreceny, tagged research, teaching 

    A forthcoming paper in JIS from Dr Glen Gray, California State University at Northridge, is entitled  Blogs as Research and Teaching Resources for Accounting Academics. So this is a blog about blogs!


    Dr Glen Gray

If William Shakespeare were writing about blogs today, he would say something like "What's in a name? A blog by any other name would still be just as sweet." Blogs are definitely a sweet source of timely information on a wide variety of topics for accounting- and AIS-related research and teaching. However, there is no absolute definition of blogs that can be used to determine whether a website is a blog or not. Blog is simply a label that a webmaster may or may not give to his or her website. As a working definition, a blog is a website where an individual (a blogger) or a group of individuals posts observations and opinions regarding a specific topic, and usually where readers can post comments. There are probably thousands of useful websites that meet this definition but are not called blogs by their webmasters. Under this broad definition of blogs, social media (Facebook, Twitter, etc.) are blogs.

Potentially useful blogs (and blog equivalents) are maintained by academics (inside and outside of the accounting domain), practitioners, large and small accounting firms, and accounting and technology vendors. The primary value of blogs is their immediacy, because bloggers are very quick to post about any events (new regulations, new lawsuits, new technologies, etc.) in the subject matter they are following. Bloggers archive their previous posts; accordingly, besides being very timely, blogs can also provide a rich history of specific subject matters. However, quoting the movie How the Grinch Stole Christmas, “One man's compost is another man's potpourri.” Blogs are not vetted or subject to independent review; consequently, it's caveat emptor when venturing into the blogosphere to collect information.

With that caution stated, different bloggers' opinions—biased or not—on a specific topic or subject matter can be a valuable starting point or input for more rigorous research. These opinions can be transformed into testable hypotheses and subsequently accepted or rejected.

These different bloggers' opinions can also be a valuable teaching resource. Students can be assigned to locate several blogs on a specific topic, summarize differences found, and develop conclusions based on their review of the blogs, along with a rationale for reaching those conclusions.

The challenge is that locating applicable blogs is a messy and arbitrary process. A search of BlogSearchEngine.org (a Google app) on the term “accounting” returns 32 million results! Fortunately, the top-listed results are frequently lists of accounting blogs (e.g., 50 Accounting Blogs You Should Follow at blog.directcapital.com). In addition to traditional web searches (e.g., google.com), specialized blog directories provide another method to locate blogs. One place to start is 23 Blog Directories to Submit Your Blog To (www.searchenginejournal.com/20-essential-blog-directories-to-submit-your-blog-to).

To fully realize the value of blogs, you should also search for websites that have all the characteristics of blogs but are not labeled blogs. For example, in addition to specifically labeled blogs, both large and small accounting firms publish newsletters, alerts, and perspectives that have all of the characteristics of blogs. These blog equivalents cover a wide variety of topics and should be included in any search for blogs for research or teaching.


  • Roger S Debreceny
    Forthcoming paper: The Effect of Frequency and Automation of...
    blog entry posted March 18, 2016 by Roger S Debreceny, tagged research 

    A forthcoming paper in JIS authored by Dr Maia Farkas at California State University, San Marcos and Dr Rina Hirsch from Hofstra University is entitled The Effect of Frequency and Automation of Internal Control Testing on External Auditor Reliance on the Internal Audit Function. Here is a summary of the paper.

     


    Dr Maia Farkas


    Dr Rina Hirsch

     

     

    Over the last two decades, audits of publicly traded companies have become increasingly onerous and costly, in large part due to extensive control testing mandated by the Sarbanes-Oxley Act of 2002. For this reason, increasing audit efficiency has become more and more important. One way to improve audit efficiency is for the external auditor to rely on a strong internal audit function (IAF). Furthermore, management benefits from improved audit efficiency by way of lower audit fees, ensuring compliance with regulations, enhancing risk assessments, and ensuring the adequacy and functioning of internal controls.

     

Using an experiment with experienced external auditors as participants, we examine a setting in which the external auditor identifies poor work performance by the IAF and management implements an internal control testing remediation strategy that varies on two characteristics: automation and frequency. The level of automation can vary from no automation (entirely manual) to complete automation (entirely information technology based). The frequency of internal control testing can vary from periodic testing to continuous testing. We investigate whether the following remediation strategies are effective in improving external auditors' perceptions of IAF competence, work performance, and objectivity, as well as subsequent reliance decisions: automated controls testing conducted on a real-time basis (akin to continuous controls monitoring or CCM), automated controls testing conducted on a weekly basis (akin to Audit Command Language or ACL), and manual controls testing conducted by internal auditors on a weekly basis.

     

Our results indicate that the frequency characteristic of the remediation strategies indeed affects external auditors' perceptions of competence, work performance, and objectivity, while the automation characteristic of the internal control testing remediation strategies does not impact these perceptions of the strength of the IAF. Contrary to our expectations, external auditors appear to increase reliance on the IAF when the internal control testing is performed less frequently as opposed to more frequently. We suggest that management may not need to invest in expensive continuous controls monitoring technologies to improve reliance subsequent to a shortcoming in the IAF's work performance if improved reliance (and hence lower audit fees) is its ultimate goal.

    Interestingly, all three remediation strategies are effective at improving assessments of the IAF’s poor work performance as well as external auditors’ reliance on the internal auditors. Thus, some benefit will accrue to management, regardless of the strategy employed. These results provide useful information to management, enabling them to maximize the benefits associated with having a higher quality IAF. Furthermore, our results are informative to standard setters that are interested in how external auditors assimilate information cues.

  • Roger S Debreceny
    Overview of JIS paper: The effects of information...
    blog entry posted March 9, 2016 by Roger S Debreceny, tagged research 

A new paper by Andrea Kelton (Wake Forest University) and Uday Murthy (University of South Florida) is "The effects of information disaggregation and financial statement interactivity on judgments and decisions of nonprofessional investors." Andrea has provided this blog post on their paper.

     

    Andrea Kelton


    Uday Murthy


     

     

Information technologies enable firms not only to report more frequently, but also to enhance the decision-usefulness of financial information through variations in presentation format. We investigate whether the provision of financial statement interactivity via a web-based drilldown mechanism improves investors' use of disaggregated financial statement information and, ultimately, their decisions. We suggest that a drilldown mechanism will mitigate the negative effects of information overload caused by disaggregation by allowing users to control their viewing of the disaggregation, focus their attention on the relevant details, and avoid tendencies toward earnings fixation. However, we expect this load-minimizing effect to depend upon the utility (i.e., relevance) of the disaggregated details to the investment task.
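As a toy illustration of the kind of interactivity at issue (not the authors' experimental interface), the sketch below models a financial statement line item that displays only its aggregate by default and reveals its disaggregated detail when the user drills down. The labels, amounts, and class names are made up for illustration.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative only: a collapsible financial statement line item.
@dataclass
class LineItem:
    label: str
    amount: float
    detail: List["LineItem"] = field(default_factory=list)
    expanded: bool = False

    def drill_down(self) -> None:
        """Reveal the disaggregated detail beneath this line item."""
        self.expanded = True

    def render(self, indent: int = 0) -> None:
        print(" " * indent + f"{self.label}: {self.amount:,.0f}")
        if self.expanded:
            for child in self.detail:
                child.render(indent + 2)

revenue = LineItem("Revenue", 1_000_000, detail=[
    LineItem("Product sales", 700_000),
    LineItem("Service revenue", 300_000),
])
revenue.render()     # aggregate view only
revenue.drill_down()
revenue.render()     # disaggregated detail shown on demand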

We conduct an experiment with nonprofessional investor participants obtained from Amazon Mechanical Turk to investigate these issues. Participants completed a simple decision case wherein they reviewed either high-utility or low-utility disaggregated financial statements, either with or without the drilldown mechanism. Overall, our results show that participants using the drilldown experienced lower cognitive load and were less susceptible to earnings fixation than those without the drilldown capability. However, when the disaggregated details provided limited new information, use of the drilldown resulted in higher levels of cognitive load compared to when the disaggregation provided new information.

Our results should inform standard setters currently considering enhanced financial statement disaggregation. We provide evidence regarding the conditions when disaggregation is helpful versus harmful to investor decision making and the benefits and costs of financial statement interactivity.

     

     

  • Roger S Debreceny
    Overview of Theme Issue on Enterprise Ontologies
    blog entry posted March 6, 2016 by Roger S Debreceny, tagged research 

    Forthcoming in the Summer 2016 issue of the Journal of Information Systems is a theme issue on Enterprise Ontologies, edited by Guido Geerts from the University of Delaware. This is an overview of the theme issue from Guido. 

    Guido Geerts


In an environment characterized by dramatic increases in the volume and variety of data, tools for integration have become progressively more important. The most common way of addressing interoperability issues is by using ontologies: formal specifications of agreed-upon conceptualizations. Ontologies have also proved to be useful as reference models and for reasoning purposes. For more than three decades now, accounting scholars have conducted research in this area, most of it focusing on the REA enterprise ontology. The latter has proved to be useful in a wide variety of applications, including as a reference model during the development of enterprise software, for reasoning purposes, to improve interoperability in electronic commerce, and as a framework for teaching core accounting and business process principles.
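For readers unfamiliar with REA, the following minimal sketch illustrates its core pattern of economic resources, events, and agents linked by stock-flow, participation, and duality relationships. The toy sale/cash-receipt exchange and the class and field names are illustrative simplifications, not the formal ontology and not drawn from the theme issue papers.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative only: the basic REA pattern of Resources, Events, and Agents.
@dataclass
class Resource:
    name: str                      # e.g., "Inventory", "Cash"

@dataclass
class Agent:
    name: str                      # e.g., "Firm", "Customer"

@dataclass
class Event:
    name: str                      # e.g., "Sale", "Cash Receipt"
    resource: Resource             # stock-flow: resource given up or received
    provider: Agent                # participation: agent giving up the resource
    recipient: Agent               # participation: agent receiving the resource
    duals: List["Event"] = field(default_factory=list)  # duality links

def pair_duality(decrement: Event, increment: Event) -> None:
    """Record that each event is the economic compensation for the other."""
    decrement.duals.append(increment)
    increment.duals.append(decrement)

# A sale of inventory compensated by a cash receipt.
inventory, cash = Resource("Inventory"), Resource("Cash")
firm, customer = Agent("Firm"), Agent("Customer")
sale = Event("Sale", inventory, provider=firm, recipient=customer)
receipt = Event("Cash Receipt", cash, provider=customer, recipient=firm)
pair_duality(sale, receipt)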

     

    The objective of the “theme issue on enterprise ontologies” was to extend research on enterprise ontologies in two ways. First, to present the latest developments in the field. This is done by the first two papers. The Scheller and Hruby paper—Business Processes and Value Delivery Modeling Using Possession, Ownership and Availability (POA) in Enterprises and Business Networks—presents a refinement to the REA enterprise ontology for defining value creation and transfer as flows of possession, ownership, and availability. The POA notation further aligns traditional accounting and REA accounting through intuitive business process descriptions. On the other hand, the Snow and Reck paper—Developing a Government Reporting Taxonomy—uses an empirical approach to create a taxonomy for government reporting. Its main objective is to improve accessibility to and comparison of government data for those who invest in municipal bond markets.

     

Second, the theme issue also initiates a research stream that aims at a better understanding of the enterprise ontology landscape, similar to efforts in other disciplines. While all enterprise ontologies focus on representing “economic phenomena,” there are important differences among them in content, scope, and use. The definition of enterprise ontologies in terms of a common framework—the Ontology and Analysis Framework (ODAF)—results in structured discussions of their strengths, weaknesses, and applicability, and also enables comparative analysis among them (i.e., what are the gaps, overlaps, and synergies?). The third and fourth papers in the theme issue discuss specific enterprise ontologies in terms of ODAF. The de Cesare and Partridge paper—BORO as a Foundation to Enterprise Ontology—presents the Business Object Reference Ontology (BORO) as both a foundational ontology and a reengineering methodology. One of BORO's characteristics is that it has been used extensively in practice for a wide variety of applications, including the re-engineering of legacy systems, the development of reference architectures for enterprise data exchange, and enterprise systems integration. On the other hand, the paper by Weigand—The e3value Ontology for Value Networks: Current State and Future Directions—provides a systematic overview of the e3value ontology and its use for exploring innovative business models from an economic point of view. In addition, it discusses a number of possible extensions, in particular the co-creation of value and value model quality.

     

  • Roger S Debreceny
    Overview of JIS paper: Applying Basic Gamification...
    blog entry posted February 26, 2016 by Roger S Debreceny, tagged research 

A forthcoming paper in JIS is Ryan J. Baxter, D. Kip Holderness, and David A. Wood, "Applying Basic Gamification Techniques to IT Compliance Training: Evidence from the Lab and Field." This blog provides an overview of the paper. doi: http://dx.doi.org/10.2308/isys-51341

    Ryan Baxter


    Kip Holderness


    David Wood


    Companies use internal controls to protect and maintain the integrity of their information systems. However, internal controls are only as effective as the employees who operate them. Consequently, companies devote valuable resources to train employees on their responsibilities to safeguard company information. Most employees dislike compliance training and find the experience boring, which can lead to ineffective training.

    In an effort to improve the efficacy of training, some companies have begun incorporating basic elements of gaming into their training modules – a practice known as “gamification.” Our study makes use of a laboratory experiment using student participants and a field study using employees at a large multi-national bank to examine whether gamified training results in greater enjoyment and effectiveness than traditional, non-gamified training.

Our participants report that gamified training is more enjoyable and interesting, and less boring, than traditional, non-gamified training modalities. In addition, participants who completed the gamified training scored higher on information security knowledge assessments than those who received no training, though they did not score higher than those who received comparable non-gamified training.

    We also find that individual gaming preferences influence the effectiveness of gamified training. Specifically, we find that gamified training results in greater knowledge acquisition for “gamers,” those who participate in gaming on their own time, relative to “non-gamers.” This result was somewhat surprising, given that gamers were less impressed with gamified training than non-gamers. Our results suggest that companies need to understand the preferences of their employees when deciding on what types of training to implement.

    In summary, though gamification does not appear to be the silver bullet needed to increase both enjoyment and learning outright, it may reduce the apathy with which employees approach training, and our results suggest that it does not hinder learning. We believe that future research in this area will guide practitioners on matching the right gamification mechanics with organizational needs.

     

  • Roger S Debreceny
    Overview of JIS paper: SECURQUAL: An Instrument for...
    blog entry posted February 25, 2016 by Roger S Debreceny, tagged research 

A forthcoming paper from Paul John Steinbart, Robyn L. Raschke, Graham Gal, and William N. Dilla is entitled "SECURQUAL: An Instrument for Evaluating the Effectiveness of Enterprise Information Security Programs." doi: http://dx.doi.org/10.2308/isys-51257

    Paul John Steinbart

    Robyn L. Raschke

    Graham Gal

    William N. Dilla

Research on information security has been hampered by the scarcity of objective data concerning the effectiveness of organizations' information security efforts. This study develops a multi-dimensional instrument based on the COBIT v4.1 Maturity Model rubrics. With the cooperation and support of the IMTA section of the AICPA, we collected four security outcome measures from 71 companies: the number of noncompliance-with-security-policy issues serious enough to be brought to the attention of the Board of Directors, the number of security-related internal control weaknesses reported to the Board, the number of attacks capable of causing serious harm that were detected and stopped before causing harm, and the number of attacks that did cause serious harm. We demonstrate that the instrument, SECURQUAL, is a reliable surrogate for measuring the effectiveness of an organization's information security program.

One desirable feature of SECURQUAL is its parsimony. It contains questions about only 18 of the COBIT v4.1 Maturity Model rubrics. Further, the instrument uses only one Likert-type question with a five-point response scale to measure each of those topics. Thus, it should be a useful tool for both researchers and practitioners to assess the overall effectiveness of an organization's information security.