AFM 241: Impact of Technology on Business
Malik Datardina
Estimated study time: 1 hr 14 min
Sources and References
Primary readings:
- Christensen, Clayton M., Michael E. Raynor, and Rory McDonald. “What Is Disruptive Innovation?” Harvard Business Review, Dec. 2015.
- Christensen, Clayton M., Stephen P. Kaufman, and Willy C. Shih. “Innovation Killers: How Financial Tools Destroy Your Capacity to Do New Things.” Harvard Business Review, Jan. 2008.
- Wessel, Maxwell, and Clayton M. Christensen. “Surviving Disruption.” Harvard Business Review, Dec. 2012.

Supplementary:
- Blosch, Marcus, and Jackie Fenn. “Understanding Gartner’s Hype Cycles.” Gartner, Inc., 2018.
- Blackburn, Simon, et al. “Strategy for a Digital World.” McKinsey & Company, Oct. 2021.
- Datardina, Malik. “Generative AI in Accounting and Finance: A Framework for Workplace Efficiency.” April 2025.

Standards and guidance:
- CPA Canada. “Audit Considerations Related to Cryptocurrency Assets and Transactions,” 2018.
- McGrath, Amanda, and Alexandra Jonker. “AI Compliance: What It Is, Why It Matters and How to Get Started.” IBM, Oct. 2024.
- EU Artificial Intelligence Act (Regulation (EU) 2024/1689).
Chapter 1: Technology Strategy and Business Disruption
Technology as a Business Phenomenon
Technology is often discussed as if it were primarily a technical matter — the domain of engineers and computer scientists. But every major technological shift in history has been, at its core, a business phenomenon. The steam engine, the assembly line, the internet, and generative AI all reshaped industries not because of what they could do technically, but because of how they changed the economics of production, distribution, and competition.
This course approaches technology from a business and strategic perspective: what does a new technology mean for competitive dynamics, for financial performance, for organizational structure, and for the accounting and finance profession?
The core argument is that business acumen — not technical skill — is the pivotal resource for enabling new technologies to cross from the laboratory to mainstream adoption. Organizations that understand how to evaluate, time, and implement technology investments create durable competitive advantage. Those that react too slowly get disrupted; those that invest too early destroy capital.
Digital Business Strategy
A digital business strategy is not simply a technology plan — it is a business strategy that is enabled and sometimes fundamentally reshaped by digital capabilities. McKinsey’s Strategy for a Digital World (Blackburn et al., 2021) argues that digital strategy requires familiar strategic disciplines — positioning, scale, and differentiation — applied in new ways:
- Faster clock speed: Digital competition moves faster than traditional competition; strategy cycles must compress
- New sources of scale: Data and network effects create scale advantages that are different in character from traditional manufacturing scale
- Ecosystem thinking: Platforms create multi-sided markets where the competitive unit is often the ecosystem, not the individual firm
The Three Digital Strategy Imperatives
McKinsey identifies three imperatives that define digitally mature organizations:
- Portfolio boldness: Digitally mature firms actively reallocate capital toward digital capabilities rather than defending legacy positions
- Talent and capability building: Technical fluency throughout the leadership team, not only in a separate “digital” function
- Operating model agility: Structures, processes, and governance that can absorb rapid change — iterative delivery, cross-functional teams, continuous experimentation
Chapter 2: Disruptive Innovation Theory
The Classic Disruption Model
Clayton Christensen’s theory of disruptive innovation (developed in The Innovator’s Dilemma, 1997) is one of the most influential and most frequently misunderstood frameworks in business strategy.
The mechanism of disruption works as follows:
- Established firms serve their most profitable (and most demanding) customers with increasingly sophisticated products. Their resource allocation processes and incentive structures push them to over-serve the top of the market.
- Disruptive entrants begin at the low end — serving customers that incumbents have dismissed as unprofitable — or in a new market context entirely (non-consumers). The initial product is inferior to the incumbent’s offering on the metrics that established customers value.
- Performance trajectory asymmetry: The disruptor improves its product rapidly. The incumbent does not respond because the disruption looks unattractive from a financial perspective — low margins, small market, poor customers.
- Market capture: Eventually, the disruptor’s product is “good enough” for mainstream customers, and it attacks the incumbent’s core business from below.
Sustaining vs. Disruptive Innovation
It is critical to distinguish disruption from other forms of innovation:
| Type | Description | Example |
|---|---|---|
| Sustaining innovation | Improves existing products for existing customers; incumbents usually win | Each new generation of iPhone (for Apple’s existing customers) |
| Low-end disruption | Targets over-served customers with a simpler, cheaper offering | Southwest Airlines (discount air travel) |
| New-market disruption | Targets non-consumers with a more accessible product | Personal computers (vs. mainframes that only businesses could afford) |
Innovation Killers — Financial Tools that Destroy Innovation
Christensen, Kaufman, and Shih (2008) argue that standard financial analytical tools — particularly discounted cash flow (DCF) and net present value (NPV) analysis — systematically bias large organizations away from disruptive investment. The mechanisms include:
The denominator problem: DCF discounts future cash flows at a rate that penalizes uncertainty, and disruptive investments carry cash flows that are both more uncertain and further in the future than those of incremental improvements to existing products. Risk-adjusting the discount rate therefore makes the incremental project look more attractive almost every time, even when the disruptive project has greater long-run potential.
Treating fixed costs as sunk: When evaluating a disruptive investment, financial analysts correctly treat sunk costs as irrelevant to the forward-looking decision. But this means the incumbent compares the disruptor’s full cost structure (all assets must be purchased) against its incremental cost (existing assets already paid for). The incumbent looks like it has an advantage, even when the disruptor’s long-run economics are better.
The earnings-per-share fixation: Short-term EPS pressure discourages investment in innovations that require years to generate returns, even when NPV is strongly positive.
The implication for financial professionals: when evaluating technology investment decisions, these biases must be explicitly recognized and adjusted for.
Chapter 3: The Gartner Hype Cycle
Technology Adoption and Irrational Expectations
When a new technology emerges, market enthusiasm typically runs far ahead of practical utility. Investment pours in, valuations balloon, and pundits declare the end of entire industries. Then reality sets in: the technology turns out to be harder to implement and less transformative (in the short run) than expected. A crash follows. Eventually, after expectations are recalibrated, the technology delivers genuine and lasting value — often reshaping an industry in ways that the original hype, ironically, had roughly predicted.
This pattern repeats with remarkable regularity. The Gartner Hype Cycle (Blosch and Fenn, 2018) provides a framework for understanding and navigating it.
The Five Phases of the Hype Cycle
The five phases are:
Innovation Trigger: A technological breakthrough — a proof-of-concept, a research announcement, a product launch — generates significant media coverage. No usable products exist yet; commercial viability is unproven.
Peak of Inflated Expectations: Early publicity produces a wave of enthusiasm. Some early adopters succeed; many more fail. The technology is expected to revolutionize everything, everywhere, immediately.
Trough of Disillusionment: Interest wanes as implementations and products fail to deliver on inflated expectations. Producers of the technology shake out; only those that improve their products to the satisfaction of early adopters survive.
Slope of Enlightenment: More instances of how the technology can benefit the enterprise emerge. Second- and third-generation products appear. Methodologies for implementation develop. More enterprises fund pilots, though conservative companies remain cautious.
Plateau of Productivity: Mainstream adoption begins. The criteria for assessing provider viability are more clearly defined. The technology is broadly applicable and scalable.
Strategic Implications for Technology Investment Timing
The Hype Cycle has direct implications for technology investment timing:
- Investing at the Peak: High cost, high risk of failure; you pay for hype, not demonstrated value. Only appropriate for organizations seeking first-mover advantage in genuinely high-stakes competitive environments.
- Investing in the Trough: Higher probability of success (the technology works for those who survived), lower cost (valuation multiples compressed), but requires patience and the willingness to absorb continued uncertainty.
- Investing at the Plateau: Low risk; reliable ROI. But competitive differentiation from the technology is minimal — everyone adopts at roughly the same time.
Chapter 4: Financial Metrics and Technological Disruption
How Financial Metrics Drive Disruption Outcomes
The financial structure of an incumbent and an entrant plays a critical role in determining who wins a disruptive battle. Two metrics are especially important: gross margin and discounted cash flow.
Gross Margin as a Disruption Signal
Gross margin tells the analyst how much of each revenue dollar is available after variable production costs. High gross margins make a business attractive for disruption: the incumbent earns substantial profits on existing customers, which it will not sacrifice by matching a low-cost competitor’s price. The disruptor, operating at lower margins, still earns enough to survive and grow.
\[ \text{Gross Margin} = \frac{\text{Revenue} - \text{COGS}}{\text{Revenue}} \]

Software businesses often operate with gross margins of 70–80%, making them particularly vulnerable to disruption by competitors who can deliver equivalent functionality at dramatically lower marginal cost (since software’s marginal cost of reproduction is near zero).
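As a quick check, the formula can be applied directly in code; the revenue and cost figures below are invented for illustration, not drawn from any real company.

```python
def gross_margin(revenue: float, cogs: float) -> float:
    """Gross margin as a fraction of revenue: (Revenue - COGS) / Revenue."""
    return (revenue - cogs) / revenue

# Hypothetical figures contrasting a software and a hardware business:
software = gross_margin(revenue=100_000_000, cogs=25_000_000)  # 0.75
hardware = gross_margin(revenue=100_000_000, cogs=70_000_000)  # 0.30
print(f"Software: {software:.0%}, Hardware: {hardware:.0%}")
```

The 75% software margin is exactly the kind of profit pool a low-cost disruptor can undercut while still earning enough to survive.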
Discounted Cash Flow and Investment Bias
As discussed in Chapter 2, DCF analysis can systematically undervalue disruptive investments because:
- The cash flows from a disruptive innovation are highly uncertain and long-dated
- The discount rate applied reflects the volatility of these cash flows
- The cash flows from sustaining innovation to existing customers are less uncertain
The result: when a financial analyst compares a disruption project against an incremental improvement project using NPV, the incremental project nearly always wins. This explains why disruption so often comes from outside the incumbent — the incumbent’s own financial processes kill the disruptive idea before it reaches market.
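The bias can be made concrete with a small hypothetical comparison: two projects with the cash-flow profiles described above, where the analyst assigns a higher discount rate to the riskier disruptive project. All cash flows and rates are invented for illustration.

```python
def npv(rate: float, cash_flows: list[float]) -> float:
    """NPV of a cash-flow series; cash_flows[0] is the time-0 outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical projects: the disruptive one has larger but later cash flows.
incremental = [-100, 40, 40, 40, 40, 40]   # predictable, near-term returns
disruptive  = [-100, 0, 0, 30, 80, 150]    # uncertain, back-loaded returns

# The analyst risk-adjusts the discount rate: 10% vs 25%...
print(round(npv(0.10, incremental), 1))  # 51.6  -> incremental wins
print(round(npv(0.25, disruptive), 1))   # -2.7  -> disruption rejected

# ...yet at a common 10% rate the disruptive project is worth far more:
print(round(npv(0.10, disruptive), 1))   # 70.3
```

The rejection is driven entirely by the denominator, not by the projects' underlying economics, which is precisely Christensen, Kaufman, and Shih's point.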
Surviving Disruption — The Incumbent’s Response
Wessel and Christensen’s “Surviving Disruption” (2012) offers guidance for incumbents facing disruption:
Identify the disruptor correctly: Not every new entrant is a disruptor. A competitor targeting the same demanding customers with a better product is a sustaining threat (manageable through traditional competitive responses). Only low-end or new-market entrants following the disruption trajectory are true disruptors.
Assess the pace of disruption: How quickly is the disruptor’s performance improving relative to customers’ requirements? If the gap is closing fast, the incumbent must act urgently. If slowly, it has time to adapt.
Create a disruptive response: The incumbent must be willing to cannibalize its own business by creating a separate unit that competes with the low-end offering — even at the cost of lower margins in the short run. Christensen's consistent prescription is to give this unit autonomy (its own resource allocation process and profit expectations) so that the parent's financial metrics cannot strangle the response before it matures.
Chapter 5: Digital Transformation Fundamentals
What Is Digital Transformation?
Digital transformation is not merely buying new software or moving data to the cloud. It is a fundamental rethinking of how work is done and how value is created. It demands changes to:
- Processes: Replacing paper-based or manual workflows with automated, data-driven equivalents
- Culture: Embedding data literacy, experimentation, and continuous learning throughout the organization
- Structure: Breaking down departmental silos so that data flows freely across functions
- Business model: In some cases, entirely new revenue streams and customer relationships emerge from digital capability
The Three Layers of Digital Change
Practitioners often describe digital transformation in three overlapping layers:
| Layer | Description | Accounting/Finance Example |
|---|---|---|
| Digitization | Converting analog information to digital format | Scanning paper invoices to PDF |
| Digitalization | Using digital data to improve existing processes | Using scanned invoice data to automate matching in AP |
| Digital transformation | Redesigning the business model around digital capability | Real-time treasury visibility; continuous audit |
Cloud Computing
Cloud computing is the delivery of computing services — servers, storage, databases, networking, software, analytics, and intelligence — over the internet (“the cloud”) to offer faster innovation, flexible resources, and economies of scale. Organizations pay only for the cloud services they use, helping them lower operating costs and scale as business needs change.
Service Models: IaaS, PaaS, SaaS
The cloud computing industry organizes services into three primary delivery models:

- IaaS (Infrastructure as a Service): Raw computing infrastructure — virtual machines, storage, networking — that the customer configures and manages
- PaaS (Platform as a Service): A managed platform (operating system, middleware, runtime) on which the customer builds and deploys its own applications
- SaaS (Software as a Service): Complete applications delivered over the internet; the customer manages only its data and configuration

The “pizza as a service” analogy is often used to illustrate these distinctions — the more the provider manages, the closer you are to having pizza delivered rather than making it at home:
| Scenario | You manage | Provider manages |
|---|---|---|
| On-premises | Everything (servers, OS, middleware, application, data) | Nothing |
| IaaS | OS, middleware, application, data | Hardware, networking, virtualization |
| PaaS | Application, data | Everything else |
| SaaS | Data and configuration | Everything else |
Deployment Models: Public, Private, Hybrid
Beyond service models, cloud deployments differ in who controls and accesses the infrastructure:
- Public cloud: Infrastructure owned and operated by a third-party provider; shared among many customers. Examples: AWS, Azure, GCP. Lowest cost; highest flexibility; data sovereignty and security concerns for regulated industries.
- Private cloud: Infrastructure dedicated to a single organization, either on-premises or hosted by a provider. Higher cost; greater control; suitable for entities with strict regulatory or confidentiality requirements (e.g., financial institutions handling client data).
- Hybrid cloud: A combination of public and private cloud, with orchestration between them. Allows organizations to run sensitive workloads privately while bursting to public cloud for peak demand.
Financial Implications of Cloud Adoption
Cloud computing fundamentally changes the capital structure of technology investment:
- CapEx to OpEx shift: On-premises infrastructure is a capital expenditure — it appears on the balance sheet, is depreciated over several years, and requires large upfront commitment. Cloud services are operating expenditures — expensed as incurred, matching cost to consumption. This shift improves cash flow predictability and reduces technology risk.
- Elastic scalability: Cloud resources scale up and down with demand. A retailer can provision additional compute capacity for the holiday season and release it in January — paying only for what is used.
- Total cost of ownership (TCO): Although per-unit cloud costs are often higher than equivalent owned infrastructure, the TCO calculation must include on-premises costs such as data center space, power, cooling, hardware maintenance, and IT staff. Cloud often wins the TCO comparison, especially for variable workloads.
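A simplified TCO comparison can be sketched as follows. Every figure is hypothetical, and a real analysis would also model growth, migration costs, and committed-use discounts.

```python
def on_prem_tco(hardware: float, years: int, annual_ops: float) -> float:
    """Owned infrastructure: upfront CapEx plus yearly operating costs
    (data center space, power, cooling, maintenance, IT staff)."""
    return hardware + years * annual_ops

def cloud_tco(monthly_rate: float, avg_utilization: float, months: int) -> float:
    """Pay-per-use OpEx: the organization pays only for capacity consumed."""
    return monthly_rate * avg_utilization * months

# Hypothetical variable workload that is busy 40% of the time, over 3 years:
print(on_prem_tco(hardware=300_000, years=3, annual_ops=80_000))       # 540000
print(cloud_tco(monthly_rate=12_000, avg_utilization=0.4, months=36))  # 172800.0
```

Note how the cloud advantage here comes from the 40% utilization: for a steady, fully utilized workload the comparison can flip.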
APIs and Microservices
Application Programming Interfaces (APIs)
APIs are the connective tissue of the modern digital economy. When a business’s accounting system automatically retrieves real-time foreign exchange rates, or when a payroll system pushes salary data directly to the general ledger, APIs are what make those connections possible.
REST APIs (Representational State Transfer) are the dominant paradigm for web-based APIs. They use standard HTTP methods:
- `GET` — retrieve data
- `POST` — create new data
- `PUT`/`PATCH` — update existing data
- `DELETE` — remove data
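The four methods map onto create-read-update-delete (CRUD) operations on a resource. The sketch below imitates that mapping in-process against a plain dictionary rather than calling a real HTTP API, purely to illustrate the semantics; the resource and field names are invented.

```python
# In-memory "resource collection" standing in for a server-side store.
invoices: dict[int, dict] = {}
next_id = 1

def post(body: dict) -> int:            # POST -- create a new resource
    global next_id
    invoices[next_id] = body
    next_id += 1
    return next_id - 1

def get(resource_id: int) -> dict:      # GET -- retrieve a resource
    return invoices[resource_id]

def patch(resource_id: int, changes: dict) -> None:  # PATCH -- partial update
    invoices[resource_id].update(changes)

def delete(resource_id: int) -> None:   # DELETE -- remove a resource
    del invoices[resource_id]

inv = post({"vendor": "Acme", "amount": 1200.00})
patch(inv, {"amount": 1250.00})
print(get(inv))   # {'vendor': 'Acme', 'amount': 1250.0}
delete(inv)
```

In a real REST API each of these calls would be an HTTP request to a URL such as `/invoices/101`, with the method carrying the intent.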
Open Banking is a regulatory and commercial movement that uses APIs to allow third-party financial applications to access bank account data (with customer consent). In Canada, open banking is being implemented by financial institutions under FCAC guidance, enabling accounting software (e.g., QuickBooks, Wave) to retrieve bank transactions directly for automated reconciliation.
Microservices Architecture
Traditional “monolithic” applications bundle all functionality together. This makes them easy to develop initially but difficult to scale and update. Microservices architecture decomposes a large application into dozens or hundreds of small services:
| Monolithic Architecture | Microservices Architecture |
|---|---|
| Single deployable unit | Many independently deployable services |
| Scaling requires scaling the entire application | Individual services scale independently |
| One technology stack for the entire application | Each service can use the most appropriate technology |
| A bug in one area can crash the whole system | Failures are isolated to individual services |
| Large, infrequent releases | Continuous deployment of individual services |
Chapter 6: Enterprise Resource Planning Systems
What Is an ERP?
Before ERP systems, organizations operated with separate, siloed systems for each function: one system for accounting, another for inventory, another for payroll. Information flowed between them through manual data entry — a process that was slow, error-prone, and produced inconsistent data. ERP systems solve this by providing a single integrated platform where a transaction entered in one module (e.g., a purchase order in procurement) automatically updates related modules (e.g., accounts payable, inventory, general ledger).
The dominant ERP vendors are:
- SAP (Systems, Applications, and Products): German multinational; the largest ERP vendor globally. SAP S/4HANA is its current flagship product, built on an in-memory database (SAP HANA). Used by most Fortune 500 companies.
- Oracle: Oracle ERP Cloud (formerly Oracle Financials Cloud) is SAP’s primary enterprise competitor. Oracle also acquired NetSuite, the leading cloud ERP for mid-market companies.
- Microsoft Dynamics 365: Microsoft’s ERP and CRM platform, tightly integrated with Microsoft 365 (Office, Teams, Power BI). Strong in mid-market.
- Workday: Cloud-native ERP focused on finance and HR; popular in large professional services and technology firms.
Core ERP Modules
Modern ERP systems are organized into functional modules, each managing a specific business domain:
Financial Accounting (FI)
The financial accounting module records all financial transactions and produces the statutory financial statements. Key functions:
- General ledger: The master record of all financial transactions; the foundation of the chart of accounts
- Accounts receivable: Customer invoicing, payment receipt, dunning (overdue invoice follow-up), credit management
- Accounts payable: Vendor invoice processing, three-way matching (purchase order, goods receipt, vendor invoice), payment runs
- Asset accounting: Fixed asset records, depreciation calculation, asset disposals
- Bank accounting: Bank reconciliation, electronic bank statement processing, cash position management
Controlling (CO)
The controlling module provides internal management accounting — the information that managers need to plan and control the business. It includes:
- Cost center accounting: Tracking costs by organizational unit (department, division)
- Profit center accounting: Tracking revenues and costs by business segment
- Product costing: Calculating the standard cost of manufactured goods
- Profitability analysis (CO-PA): Multi-dimensional analysis of profitability by customer, product, region, channel
Materials Management (MM)
Manages the procurement of materials and the management of inventory:
- Purchase requisitions, purchase orders, goods receipts
- Inventory valuation (FIFO, moving average)
- Vendor evaluation and supplier management
Sales and Distribution (SD)
Manages the order-to-cash cycle:
- Customer orders, delivery, shipping, billing
- Pricing conditions, rebates, promotions
- Integration with accounts receivable for posting of customer invoices
Human Capital Management (HCM)
Manages the employee lifecycle from hiring through retirement:
- Personnel administration, organizational management
- Payroll processing (with tax and deduction calculations)
- Time management, leave tracking
ERP Implementation: Challenges and Change Management
ERP implementations are among the most complex and costly IT projects an organization can undertake. They routinely exceed budgets, extend timelines, and fall short of expected benefits. Understanding why requires understanding both the technical and organizational dimensions.
The Implementation Lifecycle
A typical ERP implementation follows a structured methodology (SAP's is called SAP Activate; Oracle's is the Oracle Unified Method):
- Prepare: Define project scope, form the project team, establish governance structures, conduct initial system configuration
- Explore: Map current business processes (“as-is”), design future-state processes (“to-be”), identify gaps between standard ERP functionality and business requirements
- Realize: Configure the system to match the to-be design; develop custom code for gaps (RICEF: Reports, Interfaces, Conversions, Enhancements, Forms); conduct unit testing
- Deploy: System integration testing, user acceptance testing (UAT), data migration, end-user training, cutover planning
- Run: Go-live, hypercare support, knowledge transfer to internal team, ongoing optimization
Why ERP Implementations Fail
- Scope creep: Stakeholders continuously add requirements, expanding the project beyond its original boundaries
- Insufficient executive sponsorship: Without active, visible support from senior leadership, the project cannot overcome organizational resistance
- Underestimating change management: A new ERP changes *how people work*; without structured change management, adoption fails
- Poor data quality: Migrating dirty data from legacy systems into the new ERP propagates errors and undermines trust in the new system
- Excessive customization: Customizing the ERP to match legacy processes defeats the purpose of implementing a best-practice system and creates a costly, fragile technical estate
Change Management in ERP Projects
The Prosci ADKAR model provides a framework for individual-level change management in technology implementations:
- Awareness: Employees understand why the change is needed
- Desire: Employees want to participate and support the change
- Knowledge: Employees know how to change (training)
- Ability: Employees demonstrate the skills and behaviours required by the new system
- Reinforcement: Changes are sustained through recognition, accountability, and feedback
ERP and Internal Controls
One of the most significant benefits of a well-implemented ERP system is the embedded system of internal controls. Because all transactions flow through a single integrated platform:
- Segregation of duties (SoD): The ERP can enforce that the same user cannot both create a vendor and approve payments to that vendor. Role-based access controls prevent incompatible function combinations.
- Automated approval workflows: Purchase orders above a threshold automatically route to the appropriate approver; the system enforces the delegation of authority matrix.
- Audit trail: Every transaction is time-stamped and linked to the user who posted it. The complete history of a document — creation, approval, posting, reversal — is preserved.
- Period-end controls: The system can prevent posting to closed periods, ensuring that financial statements are not retroactively altered.
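The segregation-of-duties control above reduces to a rule check: given each user's roles and a matrix of incompatible combinations, flag violations. The role names and assignments below are hypothetical.

```python
# Pairs of roles the (hypothetical) control matrix defines as incompatible.
SOD_CONFLICTS = {
    ("create_vendor", "approve_payment"),
    ("post_journal", "approve_journal"),
}

# Hypothetical user-role assignments exported from the ERP.
user_roles = {
    "alice": {"create_vendor", "post_journal"},
    "bob":   {"create_vendor", "approve_payment"},   # incompatible combination
    "carol": {"approve_journal"},
}

def sod_violations(assignments: dict[str, set[str]]) -> list[tuple]:
    """Return (user, conflicting_pair) for every incompatible combination held."""
    return [
        (user, pair)
        for user, roles in assignments.items()
        for pair in SOD_CONFLICTS
        if set(pair) <= roles          # user holds both roles in the pair
    ]

print(sod_violations(user_roles))
# [('bob', ('create_vendor', 'approve_payment'))]
```

In practice the ERP enforces this preventively through role-based access control; a report like this is the detective counterpart auditors run over access extracts.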
For external auditors, the presence of a well-configured ERP allows a controls reliance approach: if IT general controls and ERP application controls are effective, the auditor can reduce substantive testing. This is the basis for IT audit work in integrated audit engagements.
Chapter 7: Data Analytics and Business Intelligence
The Data Analytics Spectrum
Business intelligence and analytics can be organized along a spectrum of increasing analytical sophistication:
| Type | Question Answered | Example | Tools |
|---|---|---|---|
| Descriptive analytics | What happened? | Revenue by region last quarter | Excel, Power BI, Tableau |
| Diagnostic analytics | Why did it happen? | Which customer segments drove the revenue decline? | Drill-down BI tools, SQL |
| Predictive analytics | What will happen? | Which customers are at risk of churn next quarter? | Regression, machine learning |
| Prescriptive analytics | What should we do? | What price maximizes expected revenue given demand elasticity? | Optimization, simulation |
Most organizations today have strong descriptive analytics capabilities but limited predictive and prescriptive analytics. Closing this gap is a major source of competitive advantage.
Structured Query Language (SQL)
SQL is the standard language for interacting with relational databases — the dominant data storage paradigm for business data. Every accounting system, ERP, and CRM stores its data in a relational database. Accountants who understand SQL can query this data directly, without waiting for IT to build reports.
Core SQL Syntax
The fundamental SQL SELECT statement has the following structure:
```sql
SELECT column1, column2, aggregate_function(column3)
FROM table_name
WHERE filter_condition
GROUP BY column1, column2
HAVING aggregate_filter_condition
ORDER BY column1 ASC;
```
Key clauses explained:
- `SELECT`: Specifies which columns to return
- `FROM`: Specifies the table (or tables, joined together) to query
- `WHERE`: Filters rows before any aggregation (applied to individual rows)
- `GROUP BY`: Aggregates rows sharing the same values in specified columns
- `HAVING`: Filters groups after aggregation (applied to aggregated results)
- `ORDER BY`: Sorts the result set
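The clauses above can be exercised end-to-end using Python's built-in sqlite3 module; the table and figures are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE expenses (dept TEXT, amount REAL);
    INSERT INTO expenses VALUES
        ('Sales', 500), ('Sales', 700), ('Finance', 300),
        ('Finance', 200), ('IT', 900);
""")

rows = conn.execute("""
    SELECT dept, SUM(amount) AS total
    FROM expenses
    WHERE amount > 250          -- row filter, applied BEFORE aggregation
    GROUP BY dept
    HAVING SUM(amount) > 600    -- group filter, applied AFTER aggregation
    ORDER BY total DESC;
""").fetchall()

print(rows)   # [('Sales', 1200.0), ('IT', 900.0)]
```

Note that Finance survives the `WHERE` filter (one row over 250) but is dropped by `HAVING`, which illustrates the before/after-aggregation distinction.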
SQL Joins
Relational databases store data in multiple related tables. Combining tables requires a JOIN operation:
```sql
SELECT i.invoice_id, i.vendor_id, i.amount, i.posted_by, i.post_date,
       v.vendor_name, v.created_by, v.created_date
FROM invoices i
INNER JOIN vendors v ON i.vendor_id = v.vendor_id
WHERE i.posted_by = v.created_by
  AND i.post_date >= '2024-01-01'
ORDER BY i.amount DESC;
```
This query returns all invoices where the person who posted the invoice is the same person who created the vendor — a potential indicator of fraudulent vendor creation and self-authorization.
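The join can be tried against fabricated sample data with sqlite3; only the invoice posted by the vendor's own creator should be flagged.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE vendors (vendor_id INT, vendor_name TEXT,
                          created_by TEXT, created_date TEXT);
    CREATE TABLE invoices (invoice_id INT, vendor_id INT, amount REAL,
                           posted_by TEXT, post_date TEXT);
    -- Invented records: jsmith both created vendor 1 and posted its invoice.
    INSERT INTO vendors VALUES (1, 'Acme', 'jsmith', '2023-11-02'),
                               (2, 'Globex', 'mlee', '2023-12-15');
    INSERT INTO invoices VALUES (101, 1, 4800.0, 'jsmith', '2024-02-01'),
                                (102, 2, 1500.0, 'jsmith', '2024-03-01');
""")

flagged = conn.execute("""
    SELECT i.invoice_id, v.vendor_name, i.amount
    FROM invoices i
    INNER JOIN vendors v ON i.vendor_id = v.vendor_id
    WHERE i.posted_by = v.created_by
      AND i.post_date >= '2024-01-01'
    ORDER BY i.amount DESC;
""").fetchall()

print(flagged)   # [(101, 'Acme', 4800.0)]
```

Invoice 102 is not flagged even though jsmith posted it, because a different user created the Globex vendor record.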
ETL: Extract, Transform, Load
Before data can be analyzed, it must typically be gathered from multiple source systems, cleaned, standardized, and loaded into an analytical data store. This process is called ETL:
- Extracted from one or more source systems (ERP, CRM, spreadsheets, APIs)
- Transformed — cleaned, standardized, deduped, and formatted for analysis
- Loaded into a target data store (data warehouse, data mart, analytical database)
The transformation step is typically the most labour-intensive. Common transformation tasks include:
- Data cleansing: Correcting invalid values (negative quantities, future dates on historical transactions), handling null values, standardizing formats (all dates to YYYY-MM-DD, all currency amounts to the same functional currency)
- Deduplication: Identifying and removing duplicate records (e.g., the same customer appearing under slightly different names)
- Enrichment: Adding data from external sources (e.g., adding the exchange rate to convert foreign currency transactions)
- Aggregation: Pre-computing summary statistics for fast dashboard rendering
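The cleansing and deduplication steps above can be sketched on a few invented records; field names, formats, and the normalization rules are hypothetical.

```python
raw = [
    {"customer": "Acme Inc", "amount": "1,200.50", "txn_date": "02/01/2024"},
    {"customer": "acme inc", "amount": "1,200.50", "txn_date": "02/01/2024"},  # duplicate
    {"customer": "Globex",   "amount": "850.00",   "txn_date": "03/15/2024"},
]

def normalize(r: dict) -> dict:
    """Cleansing: standardize name casing, parse amounts, ISO-format dates."""
    m, d, y = r["txn_date"].split("/")            # assumed MM/DD/YYYY input
    return {
        "customer": r["customer"].lower().strip(),
        "amount": float(r["amount"].replace(",", "")),
        "txn_date": f"{y}-{m}-{d}",               # -> YYYY-MM-DD
    }

def transform(records: list[dict]) -> list[dict]:
    seen, clean = set(), []
    for rec in map(normalize, records):
        key = (rec["customer"], rec["amount"], rec["txn_date"])
        if key not in seen:                       # deduplication
            seen.add(key)
            clean.append(rec)
    return clean

print(transform(raw))   # two records survive; the near-duplicate is dropped
```

Real ETL pipelines do the same work at scale with dedicated tools, but the logic of each transformation step is exactly this shape.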
Modern ETL has evolved into ELT (Extract, Load, Transform) — loading raw data into a cloud data warehouse first and then performing transformations using the warehouse’s compute power. This approach is enabled by the cheap storage and elastic compute of cloud platforms (Snowflake, Google BigQuery, Amazon Redshift).
Data Warehouses and Data Lakes
| Dimension | Data Warehouse | Data Lake |
|---|---|---|
| Data types | Structured (tables) | Structured, semi-structured, unstructured |
| Schema | Schema on write (defined before loading) | Schema on read (defined at query time) |
| Query performance | High (pre-aggregated, indexed) | Variable (raw data must be processed) |
| Use cases | Operational reporting, dashboards | ML model training, exploratory analytics |
| Users | Business analysts, finance teams | Data scientists, ML engineers |
Many modern organizations maintain both: a data lake for raw data storage and ML experimentation, and a data warehouse (or “lakehouse”) for governed, production-quality analytics consumed by finance teams.
Business Intelligence and Visualization Tools
Business Intelligence (BI) tools allow non-technical users to build reports, dashboards, and visualizations by connecting to data sources — without writing code.
Tableau
Tableau is a leading data visualization platform known for its drag-and-drop interface and the quality of its visualizations. Key concepts:
- Data source: A connection to a database, Excel file, or cloud service
- Dimensions: Categorical fields (product name, region, customer segment) used to slice data
- Measures: Quantitative fields (revenue, quantity, margin) that are aggregated
- Marks: Visual encodings — color, size, shape, label — used to represent data
- Dashboards: Collections of multiple visualizations arranged for at-a-glance insight
Microsoft Power BI
Power BI is Microsoft’s BI platform, tightly integrated with Excel, Azure, and Microsoft 365. It consists of three components:
- Power BI Desktop: A Windows application for building reports and data models
- Power BI Service: A cloud-based platform for publishing, sharing, and collaborating on reports
- Power BI Mobile: Mobile apps for consuming reports on smartphones and tablets
Power BI uses DAX (Data Analysis Expressions) for calculated columns and measures — a formula language similar in spirit to Excel formulas. It also integrates with Power Query (whose transformation language is called M) for ETL transformations.
Descriptive vs. Diagnostic Analytics in Detail
Descriptive Analytics
Descriptive analytics answers “what happened?” — it summarizes historical data to provide a factual basis for decisions. Common outputs:
- Financial statements: Income statement, balance sheet, cash flow statement — the original business intelligence
- Management reports: Budget vs. actual, KPI scorecards, trend reports
- Exception reports: Highlighting transactions or conditions that fall outside defined parameters (invoices over a threshold, expense reports with unusual merchant categories)
Diagnostic Analytics
Diagnostic analytics answers “why did it happen?” — it goes beyond reporting to identify root causes. Techniques include:
- Drill-down analysis: Starting from a high-level anomaly (e.g., revenue declined 8% YoY) and progressively disaggregating (by region, by product, by customer) until the root cause is isolated
- Correlation analysis: Identifying statistical relationships between variables (e.g., customer acquisition cost and lifetime value)
- Cohort analysis: Comparing the behavior of groups of customers acquired in different periods
- Variance analysis: Decomposing financial variances into volume effects, price effects, and mix effects
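The variance-analysis technique above can be sketched in a few lines of Python. This is a minimal illustration for a single product (so there is no mix effect), using one common decomposition convention; the quantities and prices are invented for the example.

```python
# Decompose a revenue variance into volume and price effects
# (single product, so no mix effect). One common convention:
#   volume effect = (actual qty - budget qty) * budget price
#   price effect  = (actual price - budget price) * actual qty

def revenue_variance(budget_qty, budget_price, actual_qty, actual_price):
    volume_effect = (actual_qty - budget_qty) * budget_price
    price_effect = (actual_price - budget_price) * actual_qty
    total = actual_qty * actual_price - budget_qty * budget_price
    # Under this convention the two effects fully explain the total variance
    assert abs(total - (volume_effect + price_effect)) < 1e-9
    return {"volume": volume_effect, "price": price_effect, "total": total}

result = revenue_variance(budget_qty=1000, budget_price=50.0,
                          actual_qty=900, actual_price=52.0)
print(result)  # volume: -5000.0, price: +1800.0, total: -3200.0
```

Here the favorable price effect (+1,800) partially offsets the unfavorable volume effect (−5,000), which is exactly the kind of insight a drill-down from a single "revenue down 3,200" figure is meant to produce.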
Chapter 8: Robotic Process Automation
What is RPA?
Robotic Process Automation (RPA) refers to software tools that automate repetitive, rule-based business processes by interacting with digital systems in the same way a human user would: clicking buttons, reading and writing data, copying information between applications, and executing structured workflows.
Unlike traditional automation that requires changing underlying software or databases, RPA operates at the presentation layer — it uses the same screens, forms, and workflows that a human employee would use. This makes it faster and cheaper to deploy than traditional IT projects.
RPA in Accounting and Finance
RPA is particularly well-suited to accounting and finance processes because they are:
- High volume: Hundreds or thousands of transactions per day
- Rule-based: The process follows defined logic (if invoice amount matches PO, approve)
- Structured data: Inputs are predictable in format (invoices, bank statements, journal entries)
Common RPA applications in finance:
- Accounts payable: Extracting invoice data, matching to purchase orders, posting to ERP, initiating payment approval workflows
- Bank reconciliation: Pulling bank statements, matching to general ledger entries, flagging unmatched items for human review
- Financial reporting: Consolidating data from multiple systems into standardized report templates
- Month-end close: Executing journal entry postings, running standard reports, populating financial models with current data
- Tax compliance: Pulling transaction data to populate tax return schedules
- Intercompany eliminations: Identifying and eliminating intercompany balances across subsidiaries during consolidation
Attended vs. Unattended RPA
| Type | Description | Trigger | Use Case |
|---|---|---|---|
| Unattended RPA | Runs automatically without human intervention, typically on a server | Scheduled or event-triggered | Overnight bank reconciliation, scheduled report generation |
| Attended RPA | Runs on a user’s desktop, triggered by the user, to assist with tasks the user is performing | User action | Helping a call center agent populate a CRM while speaking with a customer |
| Hybrid RPA | Combines both attended and unattended automation | Mixed | Complex processes where some steps require human judgment and others are fully automatable |
Microsoft Power Automate
Power Automate (formerly Microsoft Flow) is a cloud-based RPA and workflow automation platform available through Microsoft 365. It enables users to build automation workflows (called “flows”) using a low-code/no-code interface.
Key features:
- Automated flows: Triggered by events (e.g., new email, new file in SharePoint)
- Desktop flows (Power Automate Desktop): Records and replays interactions with desktop applications — the core RPA capability
- AI Builder integration: Adds pre-built AI models for document processing, image recognition, and prediction
The Forrester Consulting analysis of Microsoft Power Automate (Dunham, 2024) and the Cineplex case (Microsoft, 2024) both demonstrate significant productivity gains from enterprise RPA deployment — Cineplex saved 30,000 hours per year by automating reporting and operational workflows.
The Economics of RPA
The business case for RPA centers on labor cost savings, error reduction, and throughput improvement:
\[ \text{Annual RPA Savings} = (\text{Hours Automated per Year}) \times (\text{Fully Loaded Hourly Labor Cost}) - \text{Annual RPA Cost} \]

However, the full economic analysis requires considering:
- Implementation cost: Design, build, test, and deployment of bots
- Maintenance cost: Bots break when the applications they interact with are updated; ongoing maintenance is significant
- Exception handling: RPA handles the standard case well but cannot handle exceptions that require judgment; humans must still manage exceptions
- Change management: Employees whose tasks are automated must be retrained or redeployed; resistance to change is a common implementation challenge
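The savings formula and the implementation-cost consideration above can be combined into a simple calculation. The figures (5,000 hours, a $60 fully loaded rate, an $80k annual bot cost, a $150k build amortized over three years) are illustrative assumptions, not benchmarks.

```python
# Sketch of the RPA business case: the text's savings formula, extended
# with a straight-line amortization of the one-time implementation cost.
# All dollar figures below are invented for illustration.

def annual_rpa_savings(hours_automated, hourly_labor_cost,
                       annual_rpa_cost, implementation_cost=0.0,
                       amortization_years=1):
    """Labor cost avoided, less licence/maintenance cost, less the
    amortized share of the one-time implementation cost."""
    gross = hours_automated * hourly_labor_cost
    return gross - annual_rpa_cost - implementation_cost / amortization_years

savings = annual_rpa_savings(hours_automated=5000, hourly_labor_cost=60.0,
                             annual_rpa_cost=80_000,
                             implementation_cost=150_000, amortization_years=3)
print(savings)  # 170000.0
```

Note what the formula cannot capture: exception-handling labor and change-management costs, which is why the qualitative considerations above still matter.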
RPA and the Future of Accounting Work
RPA does not eliminate accounting jobs — it eliminates accounting tasks. A study by McKinsey Global Institute found that while approximately 60% of all occupations have at least 30% of their activities that could be automated with current technology, very few occupations are 100% automatable. For accountants, the tasks at highest risk of RPA are:
- Data entry and transaction coding
- Routine report generation
- Standard reconciliations (bank, intercompany, vendor statement)
- Tax form population from structured data
The tasks that remain human — and become more important as routine work is automated — are:
- Professional judgment in ambiguous situations
- Client and stakeholder communication
- Designing and overseeing automated workflows
- Investigating exceptions flagged by automated systems
- Strategic analysis and decision support
Chapter 9: Generative AI in Business
What is Generative AI?
Generative AI refers to machine learning models capable of generating new content — text, images, code, audio, video — based on patterns learned from large training datasets. The dominant paradigm is the Large Language Model (LLM), a neural network trained on vast quantities of text to predict the next token in a sequence. Models such as OpenAI’s GPT-4 and Anthropic’s Claude are examples.
How LLMs Work: A Conceptual Overview
Understanding generative AI does not require deep mathematical knowledge, but a conceptual grasp of how these models are built helps in evaluating their capabilities and limitations.
Pre-training: The model is trained on enormous quantities of text — web pages, books, code, scientific papers — to predict the next word (token) in a sequence. This process requires massive computational resources (thousands of GPUs for months). The result is a “foundation model” that encodes vast general-purpose language understanding.
Fine-tuning and Reinforcement Learning from Human Feedback (RLHF): The foundation model is further trained on human-curated examples and feedback to align it with human preferences — to be helpful, honest, and to avoid generating harmful content.
Inference: The trained model is deployed and responds to user prompts by generating tokens one at a time, each conditioned on all previous tokens in the context window.
The context window — the amount of text the model can “see” at once — is a critical architectural limit. Early LLMs had context windows of 4,000 tokens (roughly 3,000 words); modern models support 128,000 tokens or more, allowing them to process entire legal contracts or financial reports in a single query.
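The 4,000-tokens-to-3,000-words ratio above implies roughly 4/3 tokens per word, which gives a quick planning estimate of whether a document fits in a model's context window. Real tokenizers vary by model, so this sketch is only a rule of thumb.

```python
# Rough context-window check using the text's rule of thumb
# (~4,000 tokens per ~3,000 words, i.e. about 4/3 tokens per word).
# Actual token counts depend on the model's tokenizer.

def estimated_tokens(text: str) -> int:
    return round(len(text.split()) * 4 / 3)

def fits_in_context(text: str, context_window: int = 128_000) -> bool:
    return estimated_tokens(text) <= context_window

doc = "net income " * 3000            # a ~6,000-word document
print(estimated_tokens(doc))          # ~8,000 tokens
print(fits_in_context(doc))           # comfortably fits a 128k-token model
```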
Generative AI in Accounting and Finance
Datardina (2025) proposes a framework for understanding how generative AI creates value in accounting and finance contexts:
- Summarization and synthesis: Condensing long documents (contracts, financial reports, regulatory filings) into actionable summaries
- Draft generation: Creating first drafts of reports, memos, and communications that human professionals then review and refine
- Code generation: Writing Python, SQL, or VBA code for data analysis, financial modeling, and automation — dramatically lowering the barrier to programmatic analysis
- Question-answering over documents: Querying large document sets (CRA guidance, IFRS standards, court decisions) to find relevant passages without manual search
- Data transformation: Converting unstructured data (text invoices, PDF statements) into structured formats for analysis
The “Cheaper, Better, Faster” Framework
For a given task, generative AI might offer:
- Cheaper: A 20-hour research project completed in 30 minutes of human-AI collaboration
- Better: Consistency in document review, no fatigue-related errors
- Faster: Near-instantaneous drafting, allowing more iteration cycles
However, AI also introduces new risks:
- Hallucinations: Confident generation of false facts — particularly dangerous in legal or financial contexts where accuracy is paramount
- Bias: Training data biases propagated to outputs (e.g., biased credit decisions from a model trained on historically biased lending data)
- Copyright and IP concerns: Generated content may incorporate patterns from copyrighted training data
- Data security: Confidential client information entered into public AI systems may be used to train future models
Text-to-Code Tools
Tools like Bolt.new and GitHub Copilot allow users to describe what they want in natural language and receive working code in return. For accounting and finance professionals:
- A student can ask for a Python script to extract data from a PDF and compute financial ratios
- An analyst can describe a financial model in words and receive a working Excel formula or Python function
- An auditor can generate SQL queries to test database controls without deep SQL expertise
The strategic implication: technical barriers to data analysis are falling rapidly. Financial professionals who combine domain knowledge with basic data literacy will have significant advantages.
Prompt Engineering for Accountants
The quality of AI output depends heavily on the quality of the input prompt. Prompt engineering is the practice of designing inputs to elicit optimal outputs from AI systems.
Key prompt engineering techniques:
- Role specification: “You are a CPA specializing in IFRS. Review the following disclosure and identify any non-compliance issues.”
- Few-shot examples: Providing examples of the desired input-output pairs before the main task
- Chain of thought: Asking the model to “think step by step” to improve reasoning quality on complex tasks
- Constraints: Specifying format, length, audience, and style: “Provide a bullet-point summary for a CFO audience, maximum 200 words.”
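The four techniques above can be combined in a single prompt. The helper below is a sketch of one way to assemble such a prompt as plain text; the structure is a common convention, not a required API format, and the IFRS scenario is taken from the role-specification example above.

```python
# Assemble a prompt that layers role specification, few-shot examples,
# chain-of-thought, and output constraints. Illustrative structure only.

def build_prompt(role, task, examples=(), constraints=(),
                 think_step_by_step=False):
    parts = [role]                                    # role specification
    for ex_in, ex_out in examples:                    # few-shot examples
        parts.append(f"Example input:\n{ex_in}\nExample output:\n{ex_out}")
    parts.append(task)
    if think_step_by_step:                            # chain of thought
        parts.append("Think step by step before answering.")
    for c in constraints:                             # format/length/audience
        parts.append(f"Constraint: {c}")
    return "\n\n".join(parts)

prompt = build_prompt(
    role="You are a CPA specializing in IFRS.",
    task="Review the following disclosure and identify any non-compliance issues.",
    constraints=["Bullet-point summary for a CFO audience", "Maximum 200 words"],
    think_step_by_step=True,
)
print(prompt)
```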
Chapter 10: Artificial Intelligence and Machine Learning in Accounting
Machine Learning Fundamentals
Machine learning (ML) is the field of AI concerned with building systems that learn from data, identify patterns, and make decisions with minimal human intervention. Unlike rule-based systems (where human experts encode the logic), ML systems learn the logic from examples.
Types of Machine Learning
| Type | Description | Accounting/Finance Application |
|---|---|---|
| Supervised learning | Trained on labeled examples (input-output pairs); learns to predict outputs for new inputs | Credit scoring (predict default), fraud detection (label transactions as fraudulent/legitimate) |
| Unsupervised learning | Finds patterns in unlabeled data; no predefined outputs | Customer segmentation, anomaly detection in journal entries |
| Reinforcement learning | An agent learns by interacting with an environment and receiving rewards or penalties | Algorithmic trading, dynamic pricing |
Common ML Algorithms in Finance
Regression models predict a continuous outcome:
- Linear regression: Predicts a dependent variable as a linear combination of independent variables. Used for revenue forecasting, expense prediction.
- Logistic regression: Predicts a binary outcome (fraud / not fraud; default / no default). Despite the name, it is a classification algorithm.
Tree-based models are among the most powerful for tabular (structured) financial data:
- Decision trees: Split data recursively on the feature that best separates the classes. Highly interpretable but prone to overfitting.
- Random forests: Ensemble of decision trees; reduces overfitting through averaging. Widely used for credit risk.
- Gradient boosting (XGBoost, LightGBM): Sequentially builds trees to correct the errors of previous trees. State-of-the-art for many financial prediction tasks.
Neural networks excel at unstructured data (text, images) but are increasingly applied to time-series financial data.
Automated Bookkeeping and Transaction Coding
One of the most mature ML applications in accounting is the automatic coding of transactions to the correct account in the chart of accounts. The process works as follows:
- The ML model is trained on historical transactions where a human has assigned the correct account code
- For each new transaction, the model extracts features: merchant name, transaction amount, category code from the bank feed, description text
- The model predicts the most likely account code and confidence score
- High-confidence predictions are auto-coded; low-confidence predictions are routed to a human for review
This capability is built into modern cloud accounting platforms (QuickBooks Online, Xero, Sage) and significantly reduces the time accountants spend on routine bookkeeping.
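The confidence-thresholding step is the heart of that workflow. In the sketch below the trained classifier is stubbed out as a lookup table; the merchant names, account codes, and 0.95 threshold are illustrative assumptions, since real platforms use proprietary models and tunable thresholds.

```python
# Auto-coding routing logic: high-confidence predictions post
# automatically, low-confidence ones go to a human. The "model" here
# is a mock lookup table standing in for a trained classifier.

MOCK_MODEL = {
    "STAPLES STORE #103": ("6200 Office Supplies", 0.98),
    "AIR CANADA YYZ":     ("6400 Travel", 0.97),
    "JOHN'S HOLDINGS":    ("9999 Unclassified", 0.41),
}

def route_transaction(merchant, threshold=0.95):
    account, confidence = MOCK_MODEL.get(merchant, ("9999 Unclassified", 0.0))
    if confidence >= threshold:
        return ("auto-coded", account)    # posted without human review
    return ("human review", account)      # queued for an accountant

print(route_transaction("STAPLES STORE #103"))  # auto-coded
print(route_transaction("JOHN'S HOLDINGS"))     # routed to human review
```

Raising the threshold trades bookkeeper time for coding accuracy, which is exactly the control decision an accountant overseeing such a system must make.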
Predictive Analytics in Finance
Revenue Forecasting
Traditional revenue forecasting relies on extrapolating historical trends and human judgment. ML-based forecasting incorporates a broader set of signals:
- Historical revenue patterns (seasonality, trend, cyclicality)
- Leading indicators (sales pipeline, web traffic, customer acquisition rates)
- External data (economic indicators, industry data, weather for seasonal businesses)
- Pricing data, promotional calendars
Models like ARIMA (Autoregressive Integrated Moving Average) and Prophet (developed by Facebook) are commonly used for time-series forecasting.
Credit Risk Scoring
Credit scoring is one of the oldest and most established ML applications in finance. A credit score is essentially the output of an ML model trained to predict the probability of default:
\[ P(\text{Default}) = f(\text{credit history, income, debt-to-income ratio, employment, ...}) \]

Traditional scoring (like FICO scores) used logistic regression on a small set of variables. Modern ML credit models use gradient boosting or neural networks trained on thousands of variables, achieving substantially better predictive accuracy — but at the cost of interpretability, which creates regulatory challenges.
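A logistic-regression scorer makes the function \(f\) concrete: a linear score over borrower features is passed through the sigmoid to yield a probability. The coefficients below are invented for illustration and are not calibrated to any real lending data.

```python
import math

# Logistic regression sketch for P(default). Coefficients are
# illustrative assumptions, not fitted values.

def probability_of_default(income_k, debt_to_income, years_employed):
    score = (-1.5                        # intercept
             - 0.01 * income_k           # higher income -> lower risk
             + 4.0 * debt_to_income      # higher DTI -> higher risk
             - 0.1 * years_employed)     # longer tenure -> lower risk
    return 1 / (1 + math.exp(-score))    # sigmoid maps the score into (0, 1)

p = probability_of_default(income_k=80, debt_to_income=0.45, years_employed=5)
print(round(p, 3))  # 0.269
```

Every coefficient in this model can be read and explained to a regulator; a gradient-boosted model with thousands of features cannot be inspected this way, which is the interpretability trade-off noted above.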
Natural Language Processing in Accounting
Natural Language Processing (NLP) is the branch of AI concerned with enabling computers to understand, interpret, and generate human language. NLP applications in accounting and auditing include:
- Contract review and abstraction: Extracting key terms (parties, duration, payment terms, termination clauses) from legal contracts for audit purposes
- Earnings call analysis: Analyzing the language of earnings calls for sentiment signals that may predict stock price movements or earnings quality
- Regulatory monitoring: Scanning regulatory publications (SEC releases, IASB exposure drafts, CRA bulletins) to identify changes relevant to a client portfolio
- Fraud detection: Analyzing the language of management communications for linguistic signals associated with deception (Larcker and Zakolyukina, 2012)
AI in Audit
The audit profession is increasingly incorporating AI and data analytics into the audit methodology:
Full Population Testing
Traditional audit sampling (e.g., testing 30 of 10,000 transactions) was a necessary compromise when testing every transaction was impractical. Data analytics tools now allow auditors to test the full population of transactions, dramatically improving audit quality:
- Every journal entry can be scanned for characteristics associated with fraud risk (round numbers, weekend postings, postings by terminated employees, offsetting entries)
- Every accounts payable transaction can be matched against vendor master to identify unusual patterns
- Every expense report can be analyzed for unusual merchant categories, duplicate claims, or policy violations
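Two of the journal-entry risk characteristics above (round amounts and weekend postings) are simple enough to show as a full-population scan. The field names, threshold, and sample entries are illustrative assumptions.

```python
from datetime import date

# Full-population journal entry scan for two fraud-risk characteristics:
# suspiciously round amounts and weekend posting dates.

def risk_flags(entry):
    flags = []
    if entry["amount"] % 1000 == 0:       # round-number amount
        flags.append("round_amount")
    if entry["posted"].weekday() >= 5:    # Saturday=5, Sunday=6
        flags.append("weekend_posting")
    return flags

journal = [
    {"id": "JE-001", "amount": 50_000, "posted": date(2024, 3, 16)},  # a Saturday
    {"id": "JE-002", "amount": 48_317, "posted": date(2024, 3, 18)},  # a Monday
]
for e in journal:
    print(e["id"], risk_flags(e))
# JE-001 -> ['round_amount', 'weekend_posting']; JE-002 -> []
```

Because the loop visits every entry rather than a sample, no high-risk posting escapes by chance — the core advantage of full-population testing.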
Continuous Auditing and Monitoring
Continuous auditing replaces the traditional year-end audit with ongoing, automated monitoring throughout the year. Data is analyzed in real time or near-real time; exceptions are flagged for human investigation as they occur rather than months after the fact.
The accounting profession is moving toward a model where the annual audit becomes a thin layer of validation on top of continuous monitoring — the auditor’s role shifts from transaction testing to system validation, exception investigation, and professional judgment.
Chapter 11: Blockchain and Distributed Ledger Technology
What is Blockchain?
A blockchain is a distributed, append-only ledger: a record of transactions that is replicated across many computers (nodes) in a network, where each block of transactions is cryptographically linked to the previous block, making the record effectively immutable.
The Cryptographic Foundation
Blockchain security relies on two key cryptographic primitives:
Cryptographic hash functions take any input and produce a fixed-length output (the “hash” or “digest”) with the following properties:
- Deterministic: The same input always produces the same hash
- One-way: It is computationally infeasible to reconstruct the input from the hash
- Avalanche effect: A tiny change in the input (even one character) produces a completely different hash
- Collision resistant: It is computationally infeasible to find two different inputs that produce the same hash
Each block in a blockchain contains the hash of the previous block. If an attacker tries to alter a historical transaction, the hash of that block changes, which invalidates the hash stored in the next block, cascading through the entire chain — making any alteration immediately detectable.
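The cascading-hash property above can be demonstrated in a few lines. This is a deliberately minimal sketch (no consensus, no signatures): each block stores the SHA-256 hash of the previous block, so editing any historical block breaks every later link.

```python
import hashlib
import json

# Minimal hash chain illustrating tamper evidence.

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(transactions):
    chain, prev = [], "0" * 64                 # genesis "previous hash"
    for tx in transactions:
        block = {"tx": tx, "prev_hash": prev}
        chain.append(block)
        prev = block_hash(block)
    return chain

def is_valid(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False                        # broken link detected
        prev = block_hash(block)
    return True

chain = build_chain(["A pays B 10", "B pays C 4"])
print(is_valid(chain))              # True
chain[0]["tx"] = "A pays B 1000"    # tamper with a historical transaction
print(is_valid(chain))              # False: the next block's stored hash no longer matches
```

The avalanche effect is what makes detection immediate: changing one character of the first transaction produces a completely different hash, which the second block's stored `prev_hash` no longer matches.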
Public-key cryptography is used to sign transactions. Each user has a public key (their address, shareable with anyone) and a private key (a secret string that proves ownership). A transaction is “signed” with the private key, and anyone can verify the signature using the public key — proving that the transaction was authorized by the private key holder without revealing the key itself.
Consensus Mechanisms
Because no central authority validates transactions, blockchain networks use consensus mechanisms — protocols by which network participants agree on which transactions are valid. The two most widely used are Proof of Work (used by Bitcoin), in which miners expend computational effort to earn the right to add the next block, and Proof of Stake (used by Ethereum since 2022), in which validators lock up cryptocurrency as collateral and forfeit it if they validate fraudulent transactions.
Public vs. Private Blockchains
| Dimension | Public Blockchain | Private (Permissioned) Blockchain |
|---|---|---|
| Access | Anyone can read and participate | Only invited participants |
| Governance | Decentralized (community-governed) | Centralized (controlled by an organization or consortium) |
| Transparency | Fully transparent | Limited to participants |
| Speed | Slow (consensus among thousands of nodes) | Fast (fewer nodes; faster consensus) |
| Examples | Bitcoin, Ethereum | Hyperledger Fabric, Corda, Quorum |
| Business use case | Cryptocurrency, DeFi, NFTs | Supply chain provenance, interbank settlement, trade finance |
Smart Contracts
Smart contracts are self-executing programs stored on a blockchain that run automatically when predefined conditions are met; they were popularized by the Ethereum blockchain. A simple smart contract might implement an escrow:
- Buyer sends payment to the smart contract address
- Seller ships goods; a trusted oracle (or IoT sensor) confirms delivery
- The smart contract automatically releases payment to the seller upon delivery confirmation
- If delivery is not confirmed within 30 days, the smart contract automatically refunds the buyer
No bank, no legal system, no escrow agent is required — the code enforces the agreement automatically. This has profound implications for trade finance, derivatives settlement, and insurance.
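The escrow logic above can be mirrored as a plain-Python state machine. Real smart contracts run on-chain (for example, in Solidity on Ethereum) and cannot be edited once deployed; this sketch only reproduces the payout logic, with the 30-day deadline modeled as a day counter and all names chosen for illustration.

```python
# Off-chain sketch of the escrow smart contract described above.

class EscrowContract:
    def __init__(self, buyer, seller, amount, deadline_days=30):
        self.buyer, self.seller = buyer, seller
        self.amount, self.deadline = amount, deadline_days
        self.state = "AWAITING_DELIVERY"

    def confirm_delivery(self, day):
        # Called by the oracle: release payment if confirmed in time
        if self.state == "AWAITING_DELIVERY" and day <= self.deadline:
            self.state = "PAID_SELLER"
            return (self.seller, self.amount)

    def check_timeout(self, day):
        # Deadline passed without confirmation: refund the buyer
        if self.state == "AWAITING_DELIVERY" and day > self.deadline:
            self.state = "REFUNDED_BUYER"
            return (self.buyer, self.amount)

escrow = EscrowContract("buyer", "seller", 1_000)
print(escrow.confirm_delivery(day=12))   # ('seller', 1000)
print(escrow.state)                      # PAID_SELLER
```

The `confirm_delivery` call is where the oracle problem discussed below enters: the code can only act on the delivery signal it is given, not on whether the goods actually arrived.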
Limitations of Smart Contracts
Despite their promise, smart contracts have significant limitations:
- The oracle problem: Smart contracts can only access on-chain data; connecting to real-world events (e.g., was a shipment delivered?) requires oracles — trusted external data feeds that introduce centralization and trust requirements
- Code is law: A bug in a smart contract cannot be reversed. In 2016, a bug in The DAO smart contract allowed an attacker to drain $60 million worth of Ether — and the only “fix” was a highly controversial hard fork of the Ethereum blockchain
- Legal enforceability: The legal status of smart contracts varies by jurisdiction; not all smart contracts are legally enforceable contracts in the common law sense
Cryptocurrency Accounting under IFRS
IFRS does not have a dedicated standard for cryptocurrency assets. CPA Canada (2018) and IFRS guidance provide direction on how existing standards apply:
Classification
The appropriate accounting treatment depends on the entity’s business model and the nature of the cryptocurrency:
Intangible asset (IAS 38): Most appropriate for entities holding cryptocurrency as a long-term investment without the intent to sell in the ordinary course of business. Cryptocurrencies have no physical substance and convey rights that meet the IAS 38 definition. Measured at cost less impairment (or revaluation model if an active market exists for IAS 38 purposes — which many argue cryptocurrencies satisfy).
Inventory (IAS 2): Appropriate for entities that hold cryptocurrency in the ordinary course of business for sale (e.g., a cryptocurrency exchange, a mining company that sells Bitcoin as its primary output). Measured at lower of cost and net realizable value (NRV), or at fair value less costs to sell for commodity broker-traders (IAS 2.3(b)).
Financial instrument (IFRS 9): Generally not applicable, because cryptocurrencies do not represent a contractual right to receive cash or another financial asset (the definition of a financial asset under IAS 32). However, stablecoins and some structured digital assets may qualify.
Measurement
For intangible assets under the cost model, cryptocurrencies are tested for impairment annually (or when indicators exist). The impairment test compares the carrying amount to the recoverable amount. Given cryptocurrency price volatility, impairment write-downs may be significant.
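The cost-model impairment test reduces to a simple comparison: write the asset down when the recoverable amount falls below carrying value. The figures below are illustrative.

```python
# Cost-model impairment test for a cryptocurrency holding.
# A loss is recognized only when recoverable amount < carrying amount;
# the asset is not written up under the cost model in this sketch.

def impairment_loss(carrying_amount, recoverable_amount):
    return max(0.0, carrying_amount - recoverable_amount)

# Bitcoin acquired at $60k; quoted market price (recoverable amount) now $41k
loss = impairment_loss(carrying_amount=60_000, recoverable_amount=41_000)
print(loss)  # 19000.0
```

The asymmetry is the point: under the cost model a price decline hits profit or loss immediately, while a subsequent recovery is not recognized as a gain, which is why volatile cryptocurrencies can produce large, one-directional write-downs.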
Under the revaluation model (IAS 38.72), revaluation gains go to other comprehensive income (OCI) as a revaluation surplus; revaluation losses are charged to profit or loss (to the extent they exceed the existing surplus). The revaluation model requires that the asset be measured at fair value by reference to an active market — a condition that major cryptocurrencies like Bitcoin and Ether appear to satisfy.
Auditing Cryptocurrency — Challenges and Considerations
CPA Canada’s guidance (2018) on auditing cryptocurrency assets and transactions identifies several challenges that distinguish crypto audit from conventional audit:
Existence and ownership: Unlike cash in a bank account, there is no third party to confirm. Ownership of a cryptocurrency address is evidenced by control of the private key. The auditor must develop procedures to verify that the entity controls the private keys — a technically complex task.
Completeness: Blockchain transactions are public, but the entity may hold assets across many wallets. Obtaining a complete population of addresses is challenging. The auditor should request a management representation letter attesting to the completeness of the disclosed wallet list.
Valuation: Cryptocurrency prices are highly volatile. Fair value measurement (IFRS 13) requires the price at a specific date, which requires access to reliable market data. The auditor should verify that the rate source (exchange price) is appropriate and consistent with prior periods.
Classification: Whether cryptocurrency is an intangible asset, inventory, or financial instrument depends on the entity’s business model.
Internal controls: Controls over private key management (custody) are critical. Loss of the private key is equivalent to permanent loss of the asset. The auditor should understand the key management infrastructure — whether keys are held in “hot wallets” (online, more convenient, more risk) or “cold wallets” (offline, hardware device, more secure).
Decentralized Finance (DeFi)
Key DeFi protocols and concepts:
- Decentralized exchanges (DEXs): Platforms like Uniswap allow users to trade tokens directly against liquidity pools, without a central order book or exchange operator. Prices are determined algorithmically.
- Lending protocols: Platforms like Aave and Compound allow users to lend cryptocurrency and earn interest, or borrow against cryptocurrency collateral — all governed by smart contracts.
- Stablecoins: Cryptocurrencies pegged to a stable asset (usually USD). Algorithmic stablecoins (like the failed TerraUSD) maintain their peg through incentive mechanisms; collateralized stablecoins (USDC, USDT) are backed by fiat currency reserves.
- Yield farming: A practice in which users move assets between DeFi protocols to maximize returns, exploiting incentive structures (governance token rewards) offered by protocols seeking liquidity.
DeFi introduces entirely new accounting and audit challenges: valuation of governance tokens, accounting for liquidity pool positions, tax treatment of yield farming rewards, and assessing the smart contract risk of DeFi positions.
Chapter 12: Cybersecurity Fundamentals for Accountants
The Threat Landscape
Cybersecurity is increasingly a core concern for accounting and finance professionals — not because accountants become security engineers, but because:
- Financial data is among the most sensitive and valuable data an organization holds
- Accountants often have privileged access to financial systems (ERP, banking platforms, payroll) that are high-value targets
- External auditors assess the adequacy of clients’ IT controls as part of integrated audit engagements
- CFOs and audit committees have governance responsibility for cybersecurity risk
Common Cyber Threats
Phishing is the most common initial attack vector. An attacker sends a fraudulent email that appears to be from a trusted source (a bank, a colleague, the CRA) and tricks the recipient into revealing credentials or installing malware.
Ransomware encrypts the victim’s data and demands payment (usually in cryptocurrency) for the decryption key. Major ransomware attacks on accounting firms and financial institutions have caused hundreds of millions of dollars in damages and business interruption.
Business Email Compromise (BEC) is a sophisticated fraud in which attackers impersonate executives or trusted vendors to trick employees into initiating unauthorized wire transfers or changing payment details. BEC caused losses of over USD 2.7 billion in 2022 (FBI IC3 report).
Insider threats arise from current or former employees who misuse privileged access — whether for financial gain, revenge, or through inadvertent negligence. The accounting department is a high-risk area for insider threats because of the financial system access that accounting staff require.
The CIA Triad Applied to Financial Systems
Security objectives for financial systems map directly to the CIA triad:
- Confidentiality: Financial data (customer records, payroll, M&A plans) must be accessible only to authorized parties
- Integrity: Financial records must not be altered without authorization; the completeness and accuracy of the general ledger must be protected
- Availability: Financial systems must be available when needed — month-end close cannot wait for a ransomware recovery
Key Controls for Financial System Security
Access Controls
Identity and Access Management (IAM) is the discipline of ensuring that the right users have access to the right resources at the right times:
- Authentication: Verifying that a user is who they claim to be. Multi-factor authentication (MFA) — requiring a password and a second factor (SMS code, authenticator app, hardware token) — is the most important single control for preventing unauthorized access.
- Authorization: Defining what authenticated users are allowed to do. Role-based access control (RBAC) assigns permissions by role rather than individually.
- Least privilege: Users should have access to only the systems and data they need to perform their job — nothing more.
- Access reviews: Periodic reviews of user access rights to ensure that accounts are not over-privileged and that access is revoked when employees change roles or leave the organization.
Segregation of Duties
Segregation of duties (SoD) is a fundamental internal control that divides a process among multiple individuals so that no single person can execute and conceal an error or fraud. In financial systems, classic SoD separations include:
- Creating vendors vs. approving payments to vendors
- Recording transactions vs. reconciling accounts
- Requesting and approving purchase orders
- Custody of assets vs. recording of assets
ERP systems enforce SoD through role-based access controls — a user assigned to the “Accounts Payable Clerk” role should not also have the “Vendor Master Maintenance” role.
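An SoD conflict check of the kind an ERP (or an auditor's access-review script) performs can be sketched as a set comparison: flag any user who holds a pair of roles the text identifies as incompatible. The role names and conflict matrix below are illustrative assumptions.

```python
# Segregation-of-duties conflict check over role assignments.
# Conflict pairs mirror the classic separations listed above.

SOD_CONFLICTS = {
    frozenset({"AP Clerk", "Vendor Master Maintenance"}),
    frozenset({"Record Transactions", "Reconcile Accounts"}),
    frozenset({"Request PO", "Approve PO"}),
}

def sod_violations(user_roles):
    roles = set(user_roles)
    # A violation is any conflict pair fully contained in the user's roles
    return [sorted(pair) for pair in SOD_CONFLICTS if pair <= roles]

print(sod_violations(["AP Clerk", "Vendor Master Maintenance"]))
# [['AP Clerk', 'Vendor Master Maintenance']]
print(sod_violations(["AP Clerk", "Reconcile Accounts"]))  # [] -- no conflict
```

Running this check across all users is the automated core of the periodic access reviews described earlier in the chapter.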
SOC 1 and SOC 2 Reports
System and Organization Controls (SOC) reports are attestation reports issued by independent auditors on the controls at a service organization — a third-party provider whose services affect its customers’ financial reporting or data security.
| Dimension | SOC 1 | SOC 2 |
|---|---|---|
| Focus | Controls over financial reporting | Security and data protection |
| Criteria | Control objectives defined by the service organization, relevant to user entities’ ICFR | AICPA Trust Services Criteria |
| Primary audience | User entity auditors | Customers and prospects evaluating the service provider |
| Report types | Type 1 (design only) / Type 2 (design + operating effectiveness over a period) | Type 1 / Type 2 |
Complementary User Entity Controls (CUECs)
SOC reports typically include a list of Complementary User Entity Controls (CUECs) — controls that the user entity must implement to achieve the control objectives described in the SOC report. For example, an ADP SOC 1 report might require the user entity to:
- Review and approve payroll reports before payment is processed
- Promptly notify ADP of employee terminations to prevent payment to departed employees
- Maintain access controls over who can update employee records submitted to ADP
The auditor must verify that the user entity has implemented the CUECs; if they are missing, the assurance provided by the SOC report is diminished.
The Role of Auditors in Cybersecurity
External auditors engage with cybersecurity in two distinct ways:
Within an integrated audit: When auditing a company with significant IT systems, the auditor must assess IT general controls (access management, change management, computer operations, program development) and application controls (input controls, processing controls, output controls) for systems relevant to financial reporting. Weak IT controls mean the auditor cannot rely on automated controls and must expand substantive testing.
As a standalone engagement: Companies increasingly engage auditors to perform SOC 2 examinations, penetration testing oversight, or cybersecurity risk assessments — standalone services that provide assurance on the security posture of the organization.
Chapter 13: IT Governance Frameworks
Why IT Governance Matters
IT governance is the framework of policies, processes, and structures that ensures an organization’s IT systems are aligned with business strategy, deliver value, manage risk appropriately, and use resources efficiently.
For accounting and finance professionals, IT governance matters because:
- IT systems underpin the financial reporting process; weak governance increases the risk of material misstatement
- Technology investments are major capital allocation decisions; governance determines whether those decisions create or destroy value
- Regulators (PCAOB, OSFI, OSC) expect strong IT governance as part of overall internal control
COBIT
COBIT 2019 is organized around a core model with 40 governance and management objectives grouped into five domains:
- Evaluate, Direct, and Monitor (EDM): Governance objectives — the board and executives evaluate options, direct management to implement plans, and monitor achievement
- Align, Plan, and Organize (APO): Strategy, architecture, innovation, portfolio, budget, workforce
- Build, Acquire, and Implement (BAI): Managing IT solutions through their development and implementation lifecycle
- Deliver, Service, and Support (DSS): Operational IT management — managing operations, service requests, incidents, problems, and continuity
- Monitor, Evaluate, and Assess (MEA): Monitoring performance, conformance, and quality
For financial auditors, the most relevant COBIT objectives relate to IT General Controls (ITGCs):
| ITGC Category | COBIT Alignment | Example Control |
|---|---|---|
| Access management | APO13 (security management), BAI09 (asset management) | Quarterly user access reviews; MFA enforcement |
| Change management | BAI06 (managing IT changes), BAI07 (managing IT change acceptance) | Formal change request, approval, and testing procedures |
| Computer operations | DSS01 (managing operations), DSS04 (managing continuity) | Automated monitoring of batch job completion; backup testing |
| Program development | BAI03 (managing solutions identification and build) | Code review, separation of development/production environments |
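The access-management control in the table above (detecting accounts that survive termination) reduces to a set difference between active system accounts and the current employee roster. A minimal sketch with hypothetical sample usernames:

```python
# Sketch of an automated user access review: flag active system accounts
# with no matching current employee. Usernames are made-up sample data.
active_accounts = {"asmith", "bjones", "cwong", "dpatel"}
current_employees = {"asmith", "cwong", "dpatel"}   # from the HR system

orphaned = active_accounts - current_employees      # accounts to deprovision
print(sorted(orphaned))
```

In practice the two sets would be extracted from the identity-management and HR systems; the comparison logic itself is this simple, which is why the control lends itself to full automation.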
ITIL
ITIL is organized around a Service Value System that describes how all components of an organization work together to enable value co-creation. Key ITIL practices relevant to financial auditors:
- Change enablement: Ensures that IT changes are assessed, authorized, planned, and reviewed. Strong change management prevents unauthorized changes to financial systems.
- Incident management: Defines how IT disruptions (system outages, data corruption) are identified, prioritized, and resolved. Effective incident management limits the financial statement impact of IT failures.
- Problem management: Goes beyond incident management to identify and address root causes, preventing recurring incidents.
- Service level management: Defines and monitors the service levels that IT commits to deliver (e.g., 99.9% system availability for the ERP). SLA compliance directly affects the reliability of the financial close process.
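An availability commitment like the 99.9% figure above translates directly into an allowed-downtime budget that an auditor can compare against incident logs. A minimal sketch (the formula only; no real SLA terms are assumed):

```python
# Translate an availability SLA percentage into allowed downtime minutes
# over a reporting period.
def allowed_downtime_minutes(sla_pct, period_hours):
    return (1 - sla_pct / 100) * period_hours * 60

# 99.9% availability over a 30-day month (720 hours) allows ~43.2 minutes
# of downtime; 99.0% allows ~432 minutes.
print(allowed_downtime_minutes(99.9, 720))
```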
COBIT vs. ITIL: A Comparison
| Dimension | COBIT | ITIL |
|---|---|---|
| Primary focus | IT governance (what to govern) | IT service management (how to manage services) |
| Audience | Board, executives, auditors | IT operations staff, service managers |
| Granularity | High-level governance objectives | Detailed process guidance |
| Orientation | Control and accountability | Service delivery and improvement |
| Relationship | Complementary: defines the governance framework (the what) | Complementary: provides operational practice guidance (the how) |
Technology Risk Management
Technology risk is the risk that technology-related failures, inadequacies, or disruptions cause financial loss, reputational damage, or failure to achieve business objectives. Technology risk is a subset of operational risk under the Basel III regulatory framework.
The risk management process for technology risk follows the standard enterprise risk management (ERM) cycle:
- Risk identification: What technology-related events could harm the organization? (System failure, cybersecurity breach, data loss, vendor failure, project failure, regulatory non-compliance)
- Risk assessment: For each identified risk, assess likelihood and potential impact. This allows risks to be prioritized.
- Risk response: For each priority risk, select a response: avoid, reduce (implement controls), transfer (insurance, outsourcing), or accept.
- Risk monitoring: Continuously monitor the effectiveness of controls and the evolution of the risk environment.
| Risk | Likelihood | Impact | Inherent Risk | Controls | Residual Risk |
|---|---|---|---|---|---|
| ERP system outage during month-end close | Low | High | Medium | Disaster recovery plan; database replication; vendor SLA | Low |
| Ransomware attack on financial servers | Medium | Very High | High | MFA; network segmentation; backup and recovery testing; EDR software | Medium |
| Unauthorized access to GL by terminated employee | Low | High | Medium | Automated deprovisioning; quarterly access reviews | Low |
| Third-party payroll provider data breach | Low | High | Medium | SOC 1 review; vendor security assessment; contractual data breach notification requirements | Low |
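The assessment and prioritization steps above can be sketched as a simple likelihood-times-impact scoring of the risk register. The numeric scale and the resulting ranking are illustrative assumptions, not drawn from any standard:

```python
# Illustrative risk-register scoring: map qualitative ratings to numbers,
# compute inherent risk as likelihood x impact, and sort by priority.
SCALE = {"Low": 1, "Medium": 2, "High": 3, "Very High": 4}

def score(likelihood, impact):
    """Inherent risk score on a 1-16 scale (assumed, not a standard)."""
    return SCALE[likelihood] * SCALE[impact]

register = [
    ("ERP outage during month-end close", "Low", "High"),
    ("Ransomware attack on financial servers", "Medium", "Very High"),
    ("Unauthorized GL access by terminated employee", "Low", "High"),
]

prioritized = sorted(register, key=lambda r: score(r[1], r[2]), reverse=True)
for name, lik, imp in prioritized:
    print(f"{score(lik, imp):>2}  {name}")
```

The ransomware scenario tops the list, matching the table above: the response step (controls such as MFA and network segmentation) is then applied to the highest-scoring risks first.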
Chapter 14: AI Ethics, Governance, and the Future of Work
The Alignment Problem
The alignment problem refers to the challenge of ensuring that AI systems pursue goals that are consistent with human values and intentions. As AI systems become more capable, the potential consequences of misalignment grow.
For business practitioners, alignment concerns are practical as well as philosophical:
- A credit-scoring AI trained on historical data may perpetuate past discrimination against protected groups
- A recommendation system optimized for short-term revenue may erode long-term customer trust
- An automated trading system may generate systemic risk through correlated behavior with similar systems
AI Regulation — The EU AI Act
The European Union’s AI Act (Regulation (EU) 2024/1689) establishes a risk-based regulatory framework for AI systems — the first comprehensive AI legislation of its kind:
| Risk Category | Examples | Regulatory Treatment |
|---|---|---|
| Unacceptable risk | Social scoring by governments, real-time biometric surveillance in public spaces | Prohibited |
| High risk | Credit scoring, recruitment, medical devices, critical infrastructure | Pre-market conformity assessment, data governance requirements |
| Limited risk | Chatbots, deepfake generation | Transparency obligations (must disclose AI nature) |
| Minimal risk | Spam filters, AI in video games | No specific obligations |
For accounting and finance professionals, high-risk AI applications include credit risk assessment tools and employment decision systems — areas where algorithmic decisions can have significant adverse effects on individuals.
General Purpose AI (GPAI)
The EU AI Act introduced a new category — General Purpose AI (GPAI) — for powerful foundation models (like GPT-4, Claude, Gemini) that can be used across many tasks. GPAI providers must:
- Maintain technical documentation
- Provide information to downstream providers who build applications on top of the model
- For systemic risk GPAI models (above a compute threshold): conduct model evaluations, adversarial testing (red teaming), and report serious incidents to regulators
Red Teaming and AI Safety
Red teaming refers to the practice of deliberately attempting to elicit harmful, incorrect, or unsafe outputs from an AI system before deployment — the adversarial testing analog of penetration testing in cybersecurity.
Red teaming exercises help identify:
- Jailbreaks: Prompts that bypass the system’s safety constraints
- Hallucination triggers: Prompts that reliably cause the model to generate false information confidently
- Bias manifestations: Prompts that reveal discriminatory outputs in specific demographic contexts
- Data leakage: Whether the model can be induced to reproduce training data (potentially including confidential information)
For organizations deploying AI in regulated industries, red teaming is becoming a regulatory expectation, not merely a best practice.
Impact on Employment and the Accounting Profession
The impact of AI on employment is a contested empirical question. The standard economic view is that technology displaces specific tasks rather than entire jobs, and that new technologies historically create new categories of work that partially or fully offset task displacement.
For the accounting and finance profession specifically:
- High automation risk (within 5–10 years): Routine data entry, reconciliation, standard report generation, basic tax return preparation
- Lower automation risk: Complex judgment-based analysis, client relationships, ethical decision-making, interpretation of ambiguous regulatory requirements, assurance engagements
- New roles created: AI governance and compliance, data quality oversight, AI model auditing, human-AI workflow design
The appropriate response for aspiring accounting and finance professionals is not to resist technological change but to actively develop fluency in AI tools while doubling down on the uniquely human capabilities — ethical judgment, professional skepticism, communication, and complex reasoning — that remain difficult to automate.
AI Compliance Framework
IBM’s AI compliance framework (McGrath and Jonker, 2024) outlines the key dimensions of responsible AI deployment in an organizational context:
- Governance: Establishing accountability (who owns AI decisions), policies for AI use, and oversight mechanisms
- Transparency: Documenting AI systems’ capabilities, limitations, and training data; making model logic explainable where possible
- Fairness: Testing for discriminatory outcomes across demographic groups; implementing bias mitigation measures
- Privacy: Ensuring AI systems comply with applicable data protection laws (PIPEDA in Canada, GDPR in the EU)
- Security: Protecting AI systems from adversarial attacks and unauthorized access
- Reliability: Monitoring deployed AI systems for performance degradation, distributional shift, and unexpected behaviors
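The fairness dimension above is often operationalized as a disparate-impact check, comparing approval rates across demographic groups against the four-fifths (80%) rule. A minimal sketch on made-up approval data; the threshold choice and data are illustrative, not part of the IBM framework:

```python
# Disparate-impact check: flag if the lower group's approval rate falls
# below 80% of the higher group's. Data below is hypothetical.
def approval_rate(outcomes):
    return sum(outcomes) / len(outcomes)

def disparate_impact(group_a, group_b, threshold=0.8):
    """Return (rate ratio, flagged?) under the four-fifths rule."""
    ra, rb = approval_rate(group_a), approval_rate(group_b)
    ratio = min(ra, rb) / max(ra, rb)
    return ratio, ratio < threshold

# 1 = loan approved, 0 = declined, by (hypothetical) demographic group
group_a = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]   # 80% approval
group_b = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]   # 40% approval

ratio, flagged = disparate_impact(group_a, group_b)
print(ratio, flagged)
```

A flagged ratio does not by itself prove discrimination, but it is the kind of quantitative evidence a bias-testing program would escalate for investigation and mitigation.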
For accounting and finance organizations, the audit committee and board are increasingly expected to provide oversight of AI governance as part of their broader enterprise risk management responsibilities.
Chapter 15: Emerging Technologies
Internet of Things (IoT)
The IoT generates a continuous stream of real-world data that is transforming industries:
- Supply chain: RFID tags and GPS sensors track inventory in real time, providing granular visibility into the location and condition of goods in transit. This data flows into ERP and warehouse management systems automatically.
- Manufacturing: Sensors on production equipment monitor temperature, vibration, and output rates. Machine learning models analyze sensor data to predict equipment failures before they occur (predictive maintenance), reducing downtime and maintenance costs.
- Retail: Smart shelves detect inventory levels and automatically trigger replenishment orders; point-of-sale systems generate real-time sales data that updates inventory records and financial systems.
- Real estate: Smart building systems (HVAC, lighting, access control) generate operational data that can be used to optimize energy costs — a direct impact on the P&L.
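The predictive-maintenance idea above can be reduced to a simple anomaly check: flag sensor readings that deviate sharply from a trailing baseline. A minimal sketch with made-up vibration values and an assumed three-standard-deviation threshold (real systems use far more sophisticated models):

```python
import statistics

# Flag readings more than z_threshold standard deviations away from the
# mean of a trailing window of recent readings.
def anomalies(readings, window=10, z_threshold=3.0):
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.fmean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev > 0 and abs(readings[i] - mean) / stdev > z_threshold:
            flagged.append(i)
    return flagged

# Steady vibration around 1.0 mm/s, then a spike suggesting bearing wear
vibration = [1.0, 1.02, 0.98, 1.01, 0.99, 1.03, 0.97, 1.0, 1.02, 0.98,
             1.01, 0.99, 1.0, 2.5]
print(anomalies(vibration))
```

The same pattern (baseline, deviation, alert) underlies the ML-based versions: the models differ in how the baseline is learned, not in the shape of the control.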
IoT and Accounting
The accounting implications of IoT are significant:
- Revenue recognition: IoT-connected products may shift revenue models from one-time sales to usage-based subscription revenue (IFRS 15 requires careful analysis of performance obligations in these arrangements)
- Inventory valuation: Real-time inventory data reduces the estimation required for inventory counts; continuous monitoring may ultimately enable real-time inventory accounting
- Asset accounting: IoT sensor data provides objective evidence of asset condition and usage, potentially improving the basis for depreciation estimates and impairment testing
- Audit evidence: IoT data provides a continuous, automated record of physical events — an emerging source of third-party audit evidence that does not require manual confirmation
Quantum Computing
Quantum computers are not simply faster classical computers — they excel at specific classes of problems, notably factoring large numbers and certain optimization and simulation tasks. This has significant implications for:
Cryptographic Risk
Most public-key cryptography (including RSA and ECC, which underlie HTTPS, digital signatures, and blockchain security) relies on the computational difficulty of factoring large numbers (RSA) or computing discrete logarithms (ECC). A sufficiently powerful quantum computer running Shor’s algorithm could solve both problems in polynomial time, breaking current cryptographic infrastructure.
This is not an immediate threat — current quantum computers (NISQ devices) have far too few stable qubits to run Shor’s algorithm at meaningful scales. However, the threat is long-term:
- “Harvest now, decrypt later”: Adversaries may be collecting encrypted communications today with the intent to decrypt them when quantum computers become capable. This is a particular concern for data with long confidentiality requirements (classified government information, long-dated financial contracts).
- Post-quantum cryptography: NIST finalized its first set of post-quantum cryptographic standards in 2024 (FIPS 203, 204, 205). Organizations are beginning the multi-year process of migrating their cryptographic infrastructure to quantum-resistant algorithms.
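A toy illustration of the dependency (insecure, tiny numbers; nothing like a real key size): anyone who factors the RSA modulus can recompute the private key, so an algorithm that factors efficiently breaks the scheme.

```python
# Toy RSA with tiny primes, purely to show why factoring breaks the key.
p, q = 61, 53            # secret primes
n = p * q                # public modulus (3233)
e = 17                   # public exponent
phi = (p - 1) * (q - 1)  # computable only from the factors of n
d = pow(e, -1, phi)      # private exponent derived from phi

msg = 65
cipher = pow(msg, e, n)          # encrypt with the public key (e, n)
assert pow(cipher, d, n) == msg  # factoring n -> phi -> d -> plaintext
# Shor's algorithm would recover p and q in polynomial time, exposing d.
```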
Optimization Applications in Finance
For financial optimization problems — portfolio optimization, derivatives pricing, logistics routing — quantum computing offers potential advantages:
- Portfolio optimization: Finding the optimal allocation across thousands of assets subject to complex constraints is a combinatorial problem that grows exponentially with the number of assets. Quantum algorithms (like QAOA) may offer speedups.
- Monte Carlo simulation: Quantum algorithms offer a provable quadratic speedup for certain Monte Carlo-style problems used in derivatives pricing and risk modeling.
- Fraud detection: Quantum machine learning may offer speedups for training complex anomaly detection models on large financial datasets.
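The quadratic speedup claim can be made concrete. Classical Monte Carlo standard error shrinks as 1/sqrt(n), so halving the error costs four times the simulation paths; quantum amplitude estimation would reach a target error with roughly 1/error queries instead of 1/error squared samples. A hedged sketch with illustrative option parameters (not from the source):

```python
import math
import random

# Classical Monte Carlo price of a European call under lognormal dynamics.
# The standard error it reports shrinks as 1/sqrt(n).
def mc_call_price(s0, k, r, sigma, t, n, seed=0):
    rng = random.Random(seed)
    disc = math.exp(-r * t)
    payoffs = []
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        st = s0 * math.exp((r - 0.5 * sigma ** 2) * t + sigma * math.sqrt(t) * z)
        payoffs.append(disc * max(st - k, 0.0))
    mean = sum(payoffs) / n
    var = sum((p - mean) ** 2 for p in payoffs) / (n - 1)
    return mean, math.sqrt(var / n)   # (estimate, standard error)

price, err = mc_call_price(s0=100, k=105, r=0.02, sigma=0.25, t=1.0, n=20_000)
# Halving err here requires 4x the paths; amplitude estimation would need
# only ~2x the quantum queries for the same improvement.
```

The caveat is practical: the speedup assumes fault-tolerant hardware and efficient quantum loading of the payoff model, neither of which exists at scale today.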
Summary: Technology Across the Accounting and Finance Function
The following table summarizes how the technologies covered in this course map to the key domains of accounting and finance practice:
| Technology | Financial Reporting | Audit | Tax | Corporate Finance | Compliance |
|---|---|---|---|---|---|
| Cloud ERP | Real-time consolidation; IFRS 16 lease recognition | IT general controls; controls reliance | Automated tax calculation; tax data extraction | Real-time cash visibility; FP&A | Access controls; audit trail |
| RPA | Automated journal entries; month-end close | Full population testing support | Tax return population | Automated report generation | Automated compliance reporting |
| Data analytics / BI | Variance analysis dashboards | Anomaly detection; SoD analysis | Tax risk analytics | Revenue forecasting; M&A analysis | Regulatory reporting |
| AI / ML | Automated XBRL tagging; disclosure drafting | Continuous auditing; fraud detection | AI-assisted transfer pricing | Credit risk models | Regulatory document review |
| Blockchain | Cryptocurrency IFRS treatment | Cryptocurrency audit procedures | Crypto tax reporting | Digital asset custody | Smart contract compliance |
| Cybersecurity | Protecting financial data integrity | SOC 1/2 review; ITGC assessment | Protecting taxpayer data | Protecting M&A information | OSFI; GDPR; PIPEDA compliance |
| IoT | Real-time inventory accounting | Third-party evidence source | Usage-based revenue recognition | Asset condition monitoring | ESG data collection |
| Quantum computing | Long-term cryptographic risk | Quantum-vulnerable control assessment | Data confidentiality | Portfolio optimization (future) | Cryptographic migration planning |