My Functional Programming Awakening: Patterns I'd Been Using All Along

#functional-programming #javascript #combinators #monads #parsers #composition #closures

In June 2023, I took David Beazley’s online course “The Functions,” a one-day, six-hour deep dive into functional programming concepts. Though taught in Python, the course opened my eyes to patterns I’d been unconsciously using in JavaScript for years. This post captures my journey through those concepts, reimagined in JavaScript.

The Mental Model: Functions as Substitution

The first revelation was understanding functional programming as fundamentally about substitution, just like template strings.

const name = "Alice";
const age = 30;
const message = `Hello ${name}, you are ${age} years old`;
console.log(message);
// Output: "Hello Alice, you are 30 years old"

Functions work the same way. When you call square(5), you’re essentially substituting the parameter:

const square = (x) => x * x;
console.log(square(5));
// Output: 25
// square(5) becomes 5 * 5

This substitution model becomes the foundation for everything else. The key insight is that functions should be pure transformations with inputs going in and outputs coming out, nothing else.

Note: JavaScript’s substitution model is complicated by eager evaluation and mutable objects, but the mental model still provides a powerful foundation for reasoning about functional code.
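To make “nothing else” concrete, here’s a small contrast of my own (not from the course) between an impure function with a hidden dependency and its pure counterpart:

```javascript
// Impure: the result depends on hidden, mutable state
let taxRate = 0.25;
function addTaxImpure(price) {
    return price * (1 + taxRate); // silently changes if taxRate changes elsewhere
}

// Pure: everything the function needs arrives through its parameters
function addTax(price, rate) {
    return price * (1 + rate);
}

console.log(addTax(100, 0.25)); // Output: 125
taxRate = 0.5;
console.log(addTaxImpure(100)); // Output: 150 -- same call, different answer
```

The pure version is trivially testable and safe to substitute anywhere; the impure one can only be understood by tracking every assignment to `taxRate`.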

Breaking Free from Global Dependencies

I encountered a theater pricing problem that was riddled with global variables:

// Problematic approach
let basePrice = 5.0;
let baseAttendance = 120;
let fixedCost = 180;

function computeProfit(price) {
    const attendees = baseAttendance - (price - basePrice) * 150;
    const revenue = attendees * price;
    const cost = fixedCost + 0.04 * attendees;
    return revenue - cost;
}

console.log(computeProfit(6));
// Output: -358.8 (negative profit!)

The problem? This function has hidden dependencies. Testing becomes a nightmare, and composition is nearly impossible.

The insight was to make functions completely self-contained. Here’s the JavaScript solution:

// Better: Functions creating functions
function makeProfitFunction(config = {}) {
    const {
        basePrice = 5.0,           // USD per ticket
        baseAttendance = 120,      // people at base price
        attendeesPerDollar = 150,  // people lost per $1 price increase
        fixedCost = 180,           // USD fixed cost per show
        costPerAttendee = 0.04     // USD variable cost per person
    } = config;

    return function computeProfit(price) {
        const attendees = baseAttendance - (price - basePrice) * attendeesPerDollar;
        const revenue = attendees * price;
        const cost = fixedCost + costPerAttendee * attendees;
        return revenue - cost;
    };
}

// Usage
const profitFunc = makeProfitFunction({ basePrice: 6.0, fixedCost: 200 });
console.log(profitFunc(6)); // Output: 515.2
console.log(profitFunc(5)); // Output: 1139.2

This pattern creates closures that remember their configuration. Each profit function becomes a self-contained universe with no external dependencies.
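To see that independence concretely, here’s a sketch with two configurations side by side (repeating makeProfitFunction so the snippet runs on its own; the “cheap show” configuration is my own illustration):

```javascript
// makeProfitFunction as defined above
function makeProfitFunction(config = {}) {
    const {
        basePrice = 5.0,
        baseAttendance = 120,
        attendeesPerDollar = 150,
        fixedCost = 180,
        costPerAttendee = 0.04
    } = config;
    return function computeProfit(price) {
        const attendees = baseAttendance - (price - basePrice) * attendeesPerDollar;
        const revenue = attendees * price;
        const cost = fixedCost + costPerAttendee * attendees;
        return revenue - cost;
    };
}

const cheapShow = makeProfitFunction();                                     // all defaults
const premiumShow = makeProfitFunction({ basePrice: 6.0, fixedCost: 200 }); // custom config

// Same ticket price, two self-contained configurations
console.log(cheapShow(5));   // Output: 415.2
console.log(premiumShow(5)); // Output: 1139.2
```

Neither closure can interfere with the other, and neither reads anything outside itself.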

Functions as First-Class Citizens

One of the most powerful realizations was that functions are just data. You can pass them around, store them in arrays, and compose them freely:

const operations = [
    x => x + 10,
    x => x * 2,
    x => x * x
];

// Apply operations in sequence
function compose(value, ops) {
    return ops.reduce((acc, op) => op(acc), value);
}

console.log(compose(2, operations));
// Output: 576
// Step by step: (2 + 10) * 2 = 24, then 24 * 24 = 576

This opened up possibilities I hadn’t considered before. Functions become building blocks that can be combined in endless ways.
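A small step further turns the same reduce trick into a reusable pipeline: instead of applying operations immediately, build a new function. This `pipe` helper is a common functional idiom, not something specific to the course:

```javascript
// pipe: compose functions left-to-right into a single new function
const pipe = (...fns) => x => fns.reduce((acc, fn) => fn(acc), x);

const transform = pipe(
    x => x + 10,
    x => x * 2,
    x => x * x
);

console.log(transform(2)); // Output: 576
console.log(transform(0)); // Output: 400
```

`transform` is now just another value: it can be stored, passed around, or fed into yet another `pipe`.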

Next, we’ll see how this same principle can build an entire parser from tiny, composable pieces.

The Parser Revolution: Combinator Pattern

The most mind-bending discovery was building a parser using combinators—functions that take other functions as input and return new functions as output. The idea is that small parsing functions can be combined to handle complex grammar.

// Basic building blocks
function parseDigits(text, index) {
    let n = index;
    while (n < text.length && /\d/.test(text[n])) {
        n++;
    }
    return n > index ? [text.slice(index, n), n] : null;
}

function parseLetters(text, index) {
    let n = index;
    while (n < text.length && /[a-zA-Z]/.test(text[n])) {
        n++;
    }
    return n > index ? [text.slice(index, n), n] : null;
}

console.log(parseDigits("123abc", 0));  // Output: ["123", 3]
console.log(parseLetters("abc123", 0)); // Output: ["abc", 3]
console.log(parseDigits("abc", 0));     // Output: null

Building Parser Generators

But here’s where the magic happens. Instead of having separate functions, we can create a parser generator—a combinator that creates parsers:

function matchingPredicate(predicate) {
    return function parse(text, index) {
        let n = index;
        while (n < text.length && predicate(text[n])) {
            n++;
        }
        return n > index ? [text.slice(index, n), n] : null;
    };
}

// Now we can generate parsers (these replace the hand-written versions above)
const parseDigits = matchingPredicate(c => /\d/.test(c));
const parseLetters = matchingPredicate(c => /[a-zA-Z]/.test(c));
const parseAlphaNum = matchingPredicate(c => /[a-zA-Z0-9]/.test(c));

console.log(parseDigits("123abc", 0));  // Output: ["123", 3]
console.log(parseLetters("abc123", 0)); // Output: ["abc", 3]

Sequencing and Choice

The real breakthrough came with sequence and choice combinators:

function sequence(...parsers) {
    return function parse(text, index) {
        const results = [];
        let currentIndex = index;

        for (const parser of parsers) {
            const result = parser(text, currentIndex);
            if (!result) return null;

            const [value, newIndex] = result;
            results.push(value);
            currentIndex = newIndex;
        }

        return [results, currentIndex];
    };
}

function choice(...parsers) {
    return function parse(text, index) {
        for (const parser of parsers) {
            const result = parser(text, index);
            if (result) return result;
        }
        return null;
    };
}

function literal(char) {
    return function parse(text, index) {
        if (index < text.length && text[index] === char) {
            return [char, index + 1];
        }
        return null;
    };
}

Now we can build complex parsers by composition:

// Parse a setting like "name=42;"
const parseSetting = sequence(
    parseLetters,
    literal('='),
    parseDigits,
    literal(';')
);

console.log(parseSetting("speed=75;", 0));
// Output: [["speed", "=", "75", ";"], 9]

console.log(parseSetting("name=123;more", 0));
// Output: [["name", "=", "123", ";"], 9]

console.log(parseSetting("invalid", 0));
// Output: null

// Consuming parser results with destructuring
const [ast, nextIndex] = parseSetting("speed=75;", 0) || [null, 0];
if (ast) {
    console.log(`Parsed: ${ast[0]}=${ast[2]}, continuing at index ${nextIndex}`);
    // Output: "Parsed: speed=75, continuing at index 9"
}
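sequence gets a workout above, but choice hasn’t been exercised yet. Here’s a sketch (rebuilding the needed pieces so it runs standalone) where a “token” is either a run of letters or a run of digits:

```javascript
// matchingPredicate and choice, as defined above
function matchingPredicate(predicate) {
    return function parse(text, index) {
        let n = index;
        while (n < text.length && predicate(text[n])) n++;
        return n > index ? [text.slice(index, n), n] : null;
    };
}

function choice(...parsers) {
    return function parse(text, index) {
        for (const parser of parsers) {
            const result = parser(text, index);
            if (result) return result;
        }
        return null;
    };
}

const parseDigits = matchingPredicate(c => /\d/.test(c));
const parseLetters = matchingPredicate(c => /[a-zA-Z]/.test(c));

// A token is whichever alternative succeeds first
const parseToken = choice(parseLetters, parseDigits);

console.log(parseToken("abc123", 0)); // Output: ["abc", 3]
console.log(parseToken("123abc", 0)); // Output: ["123", 3]
console.log(parseToken("!?", 0));     // Output: null
```

Because choice tries alternatives in order, putting the more specific parser first matters when alternatives overlap.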

Transforming Results with Reduce

The raw parser output isn’t very useful. We need to transform it:

function reduce(parser, transformer) {
    return function parse(text, index) {
        const result = parser(text, index);
        if (!result) return null;

        const [value, newIndex] = result;
        return [transformer(value), newIndex];
    };
}

// Transform the raw parsing result into a useful object
const parseSettingObject = reduce(
    parseSetting,
    ([name, , value]) => ({ [name]: parseInt(value, 10) })
);

console.log(parseSetting("speed=75;", 0));
// Output: [["speed", "=", "75", ";"], 9]

console.log(parseSettingObject("speed=75;", 0));
// Output: [{ speed: 75 }, 9]
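A natural next step is repetition. The `zeroOrMore` combinator below is my own extension (it isn’t part of the set defined above), but it composes with `reduce` to parse a whole config string into one merged object:

```javascript
// Building blocks repeated from above so the sketch runs standalone
const matchingPredicate = pred => (text, index) => {
    let n = index;
    while (n < text.length && pred(text[n])) n++;
    return n > index ? [text.slice(index, n), n] : null;
};
const literal = char => (text, index) =>
    text[index] === char ? [char, index + 1] : null;
const sequence = (...parsers) => (text, index) => {
    const results = [];
    for (const parser of parsers) {
        const result = parser(text, index);
        if (!result) return null;
        results.push(result[0]);
        index = result[1];
    }
    return [results, index];
};
const reduce = (parser, transform) => (text, index) => {
    const result = parser(text, index);
    return result ? [transform(result[0]), result[1]] : null;
};

// zeroOrMore: apply a parser repeatedly until it fails (my own extension)
const zeroOrMore = parser => (text, index) => {
    const results = [];
    let result;
    while ((result = parser(text, index))) {
        results.push(result[0]);
        index = result[1];
    }
    return [results, index];
};

const parseLetters = matchingPredicate(c => /[a-zA-Z]/.test(c));
const parseDigits = matchingPredicate(c => /\d/.test(c));
const parseSettingObject = reduce(
    sequence(parseLetters, literal('='), parseDigits, literal(';')),
    ([name, , value]) => ({ [name]: parseInt(value, 10) })
);

// Parse a whole config string into one merged object
const parseConfig = reduce(
    zeroOrMore(parseSettingObject),
    objects => Object.assign({}, ...objects)
);

console.log(parseConfig("speed=75;height=20;", 0));
// Output: [{ speed: 75, height: 20 }, 19]
```

Note that `zeroOrMore` always succeeds (possibly with an empty list), which is exactly why it pairs so well with the other combinators.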

Error Handling: The Result Pattern

I discovered an elegant solution to the messy problem of error handling in interfaces. Instead of throwing exceptions that muddy the waters, return a Result object:

class Result {
    constructor(value = null, error = null) {
        this.value = value;
        this.error = error;
    }

    static success(value) {
        return new Result(value, null);
    }

    static failure(error) {
        return new Result(null, error);
    }

    unwrap() {
        if (this.error) {
            throw this.error;
        }
        return this.value;
    }

    isSuccess() {
        return this.error === null;
    }
}

// Example usage
const successResult = Result.success(42);
console.log(successResult.unwrap());     // Output: 42
console.log(successResult.isSuccess());  // Output: true

const failureResult = Result.failure(new Error("Something went wrong"));
console.log(failureResult.isSuccess());  // Output: false

try {
    failureResult.unwrap();
} catch (error) {
    console.log(error.message);  // Output: "Something went wrong"
}
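One practical payoff: wrap any throwing function once, and the rest of your code only ever sees Results. The `attempt` helper is my own sketch built on the class above:

```javascript
// Result class as defined above
class Result {
    constructor(value = null, error = null) {
        this.value = value;
        this.error = error;
    }
    static success(value) { return new Result(value, null); }
    static failure(error) { return new Result(null, error); }
    isSuccess() { return this.error === null; }
    unwrap() {
        if (this.error) throw this.error;
        return this.value;
    }
}

// attempt: convert any throwing function into a Result-returning one
function attempt(fn) {
    return (...args) => {
        try {
            return Result.success(fn(...args));
        } catch (error) {
            return Result.failure(error);
        }
    };
}

const safeParse = attempt(JSON.parse);

console.log(safeParse('{"a": 1}').unwrap());    // Output: { a: 1 }
console.log(safeParse('not json').isSuccess()); // Output: false
```

The try/catch lives in exactly one place; every caller of `safeParse` gets an ordinary value to inspect instead of a surprise exception.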

Chaining Operations: The Monadic Pattern

The trickiest concept was chaining operations that might fail. The Result pattern breaks normal function composition because ordinary functions don’t know how to handle Result objects.

The solution? Make Result objects chainable:

class Result {
    // ... previous methods ...

    map(fn) {
        if (this.error) {
            return this; // Short-circuit on error
        }

        try {
            return Result.success(fn(this.value));
        } catch (error) {
            return Result.failure(error);
        }
    }

    flatMap(fn) {
        if (this.error) {
            return this;
        }

        try {
            return fn(this.value);
        } catch (error) {
            return Result.failure(error);
        }
    }
}

// Now we can chain operations elegantly
const add10 = x => x + 10;
const double = x => x * 2;
const square = x => x * x;

const result = Result.success(2)
    .map(add10)
    .map(double)
    .map(square);

console.log(result.unwrap()); // 576

This pattern allows you to build pipelines where errors automatically propagate without manual checking at each step.
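To watch the propagation happen, here’s a sketch (restating Result with map so it runs standalone) in which one failing step poisons the chain and the remaining maps are skipped:

```javascript
class Result {
    constructor(value = null, error = null) {
        this.value = value;
        this.error = error;
    }
    static success(value) { return new Result(value, null); }
    static failure(error) { return new Result(null, error); }
    isSuccess() { return this.error === null; }
    map(fn) {
        if (this.error) return this; // short-circuit on error
        try {
            return Result.success(fn(this.value));
        } catch (error) {
            return Result.failure(error);
        }
    }
}

// A step that can fail partway through a pipeline
const checkPositive = x => {
    if (x <= 0) throw new Error("must be positive");
    return x;
};

const good = Result.success(2).map(x => x - 1).map(checkPositive).map(x => x * 10);
console.log(good.isSuccess(), good.value); // Output: true 10

// The failing step produces a failure; the later map never runs
const bad = Result.success(2).map(x => x - 5).map(checkPositive).map(x => x * 10);
console.log(bad.isSuccess(), bad.error.message); // Output: false must be positive
```

Both pipelines are written identically; only the data decides which path executes.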

Interface Design: The Arrow Problem

I learned that most programming complexity comes from the “arrow” between functions. How do inputs flow in? How do outputs flow out? What about errors?

Consider this function signature design challenge:

// Option 1: Simple but inflexible
function delay(seconds, fn) {
    setTimeout(fn, seconds * 1000);
}
delay(5, () => console.log("Hello")); // Only works with no-arg functions

// Option 2: Pass arguments separately
function delay(seconds, fn, args = []) {
    setTimeout(() => fn(...args), seconds * 1000);
}
delay(5, Math.max, [10, 20]);

// Option 3: Use rest parameters
function delay(seconds, fn, ...args) {
    setTimeout(() => fn(...args), seconds * 1000);
}
delay(5, Math.max, 10, 20);

// Option 4: Return a function (currying)
function delay(seconds, fn) {
    return (...args) => {
        setTimeout(() => fn(...args), seconds * 1000);
    };
}
const delayedMax = delay(5, Math.max);
delayedMax(10, 20);

Each approach has trade-offs. Option 1 is simple but limiting. Option 2 works, but callers must bundle arguments into an array. Option 3 mixes concerns on the function boundary and can hide the actual function behind variadic arguments in stack traces, making debugging harder. Option 4 separates concerns cleanly but requires two function calls.

Practical Applications in Modern JavaScript

These patterns show up everywhere in modern JavaScript:

Promise chains are monadic:

fetch('/api/user')
    .then(response => response.json())
    .then(user => user.profile)
    .catch(error => console.error(error));

Array methods use higher-order functions:

const numbers = [1, 2, 3, 4, 5];
const result = numbers
    .filter(x => x % 2 === 0)  // [2, 4]
    .map(x => x * 2)           // [4, 8]
    .reduce((sum, x) => sum + x, 0);  // 12

console.log(result); // Output: 12

// Each step transforms the data:
console.log(numbers.filter(x => x % 2 === 0));  // Output: [2, 4]
console.log(numbers.filter(x => x % 2 === 0).map(x => x * 2));  // Output: [4, 8]

React hooks embrace closures and function composition:

function useCounter(initialValue = 0) {
    const [count, setCount] = useState(initialValue);

    const increment = useCallback(() => setCount(c => c + 1), []);
    const decrement = useCallback(() => setCount(c => c - 1), []);

    return { count, increment, decrement };
}

Key Takeaways

My journey through functional programming revealed patterns I’d been using intuitively but never understood deeply:

  1. Composition over inheritance: Small, focused functions that can be combined beat large, monolithic ones.

  2. Pure functions are predictable: No side effects means easier testing and reasoning.

  3. The arrow matters: How functions connect is often more important than what they do internally.

  4. Error handling can be elegant: Result patterns make error propagation explicit and composable.

  5. Code generation is powerful: Functions that create functions open up new architectural possibilities.

The course didn’t advocate for pure functional programming. Instead, it showed how functional concepts can make any codebase more robust and composable. In JavaScript, we have the flexibility to blend these patterns with other approaches as needed.

These insights from Beazley’s course have made me more intentional about function design and composition in my daily work. The patterns feel natural in JavaScript’s functional-friendly environment. They’ve helped me write more maintainable and testable code.