Design Patterns in Scala | Scala Programming Guide

By Dmitri Meshin

Creational Patterns

Factory Pattern with Polymorphism

Factory methods encapsulate object creation, separating construction concerns from usage. This is particularly valuable in Scala when you have multiple implementations of a trait and want a single place to decide which one to instantiate based on runtime conditions. The pattern promotes abstraction: callers depend on the trait, not on concrete implementations, which makes code more testable and maintainable. In real applications, factories often read configuration, environment variables, or plugin registries to determine which implementation to create; payment processors, database connections, and logging providers are typical pluggable components built this way. Centralizing creation logic in a factory object keeps it consistent and makes it easy to add new implementations without changing client code.

// Transaction identifier returned on success
case class TransactionId(value: String)

// Payment processor abstraction
sealed trait PaymentProcessor {
  def process(amount: BigDecimal): Either[String, TransactionId]
  def supportsRefunds: Boolean
}

// Concrete implementations
class StripeProcessor(apiKey: String) extends PaymentProcessor {
  override def process(amount: BigDecimal): Either[String, TransactionId] = {
    // Call Stripe API
    Right(TransactionId("stripe_txn_123"))
  }

  override def supportsRefunds: Boolean = true
}

class PayPalProcessor(appId: String) extends PaymentProcessor {
  override def process(amount: BigDecimal): Either[String, TransactionId] = {
    // Call PayPal API
    Right(TransactionId("paypal_txn_456"))
  }

  override def supportsRefunds: Boolean = true
}

class MockProcessor() extends PaymentProcessor {
  override def process(amount: BigDecimal): Either[String, TransactionId] = {
    if (amount > BigDecimal("50000")) {
      Left("Amount exceeds limit")
    } else {
      Right(TransactionId("mock_txn_789"))
    }
  }

  override def supportsRefunds: Boolean = false
}

// Factory object - centralized creation
object PaymentProcessorFactory {
  def create(config: PaymentConfig): Either[String, PaymentProcessor] = {
    config.provider match {
      case "stripe" =>
        config.apiKey match {
          case Some(key) => Right(new StripeProcessor(key))
          case None => Left("Stripe API key not provided")
        }

      case "paypal" =>
        config.appId match {
          case Some(id) => Right(new PayPalProcessor(id))
          case None => Left("PayPal app ID not provided")
        }

      case "mock" =>
        Right(new MockProcessor())

      case provider =>
        Left(s"Unknown payment provider: $provider")
    }
  }
}

// Config loaded from environment or file
case class PaymentConfig(
  provider: String,
  apiKey: Option[String] = None,
  appId: Option[String] = None
)

// Usage
val config = PaymentConfig(
  provider = "stripe",
  apiKey = sys.env.get("STRIPE_API_KEY")  // Option: absent key becomes None, not an exception
)

PaymentProcessorFactory.create(config) match {
  case Right(processor) =>
    processor.process(BigDecimal("99.99")).fold(
      error => println(s"Payment failed: $error"),
      txnId => println(s"Payment succeeded: $txnId")
    )
  case Left(error) =>
    println(s"Factory error: $error")
}

Type-Safe Builder Pattern

Builders provide a fluent, readable way to construct complex objects step-by-step while enforcing type safety at compile time. The traditional Builder pattern from Java becomes even more powerful in Scala through the use of phantom types: sealed traits that track which required fields have been provided. This ensures that you can't construct an incomplete object—the type system prevents it. Builders are ideal for domain objects with many optional fields or complex initialization requirements. Rather than constructor overloading (which becomes unmanageable) or a single constructor with many parameters, builders let callers specify only the fields they care about. This section demonstrates a type-safe builder that guarantees at compile time that all required fields are set before you can call build().

// Complex domain object
case class ReportConfig(
  title: String,
  dataSource: String,
  filters: Map[String, String],
  columns: List[String],
  sortBy: Option[String],
  limit: Option[Int],
  format: String  // "pdf" or "csv"
)

// Type-safe builder using phantom types
sealed trait HoldsTitle
sealed trait HoldsDataSource
sealed trait HoldsFormat

class ReportBuilder[T] private (
  title: Option[String] = None,
  dataSource: Option[String] = None,
  filters: Map[String, String] = Map(),
  columns: List[String] = List(),
  sortBy: Option[String] = None,
  limit: Option[Int] = None,
  format: Option[String] = None
) {
  // Can only build if all required fields are set
  def build(implicit
    t: T <:< (HoldsTitle with HoldsDataSource with HoldsFormat)
  ): ReportConfig = {
    ReportConfig(
      title = title.get,
      dataSource = dataSource.get,
      filters = filters,
      columns = if (columns.isEmpty) List("*") else columns,
      sortBy = sortBy,
      limit = limit,
      format = format.get
    )
  }

  // Methods that add state and update type parameter
  def withTitle(t: String): ReportBuilder[T with HoldsTitle] = {
    new ReportBuilder[T with HoldsTitle](
      title = Some(t),
      dataSource = dataSource,
      filters = filters,
      columns = columns,
      sortBy = sortBy,
      limit = limit,
      format = format
    )
  }

  def withDataSource(ds: String): ReportBuilder[T with HoldsDataSource] = {
    new ReportBuilder[T with HoldsDataSource](
      title = title,
      dataSource = Some(ds),
      filters = filters,
      columns = columns,
      sortBy = sortBy,
      limit = limit,
      format = format
    )
  }

  def withFormat(f: String): ReportBuilder[T with HoldsFormat] = {
    new ReportBuilder[T with HoldsFormat](
      title = title,
      dataSource = dataSource,
      filters = filters,
      columns = columns,
      sortBy = sortBy,
      limit = limit,
      format = Some(f)
    )
  }

  // Optional fields
  def withFilter(key: String, value: String): ReportBuilder[T] = {
    new ReportBuilder[T](
      title = title,
      dataSource = dataSource,
      filters = filters + (key -> value),
      columns = columns,
      sortBy = sortBy,
      limit = limit,
      format = format
    )
  }

  def withColumn(col: String): ReportBuilder[T] = {
    new ReportBuilder[T](
      title = title,
      dataSource = dataSource,
      filters = filters,
      columns = columns :+ col,
      sortBy = sortBy,
      limit = limit,
      format = format
    )
  }

  def withLimit(n: Int): ReportBuilder[T] = {
    new ReportBuilder[T](
      title = title,
      dataSource = dataSource,
      filters = filters,
      columns = columns,
      sortBy = sortBy,
      limit = Some(n),
      format = format
    )
  }

  // Named withSortBy to avoid clashing with the sortBy constructor parameter
  def withSortBy(field: String): ReportBuilder[T] = {
    new ReportBuilder[T](
      title = title,
      dataSource = dataSource,
      filters = filters,
      columns = columns,
      sortBy = Some(field),
      limit = limit,
      format = format
    )
  }
}

// Entry point - no types satisfied
object ReportBuilder {
  def apply(): ReportBuilder[Any] = new ReportBuilder[Any]()
}

// Type-safe compilation: requires all fields before build()
val validReport = ReportBuilder()
  .withTitle("Sales Report")
  .withDataSource("sales_db")
  .withFormat("pdf")
  .withColumn("date")
  .withColumn("amount")
  .withFilter("region", "US")
  .withLimit(1000)
  .withSortBy("date")
  .build()  // Compiles! All required fields present

// This won't compile - missing required fields
// val incomplete = ReportBuilder()
//   .withTitle("Incomplete")
//   .build()  // ERROR: cannot find evidence
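
Before reaching for phantom types, note that Scala's named and default parameters often give the same ergonomics with no builder at all: required fields simply get no default, so the compiler enforces them. A minimal sketch using a hypothetical parallel `ReportConfig2`, not the builder above:

```scala
// Required fields have no defaults, so omitting them is a compile error;
// optional fields get defaults, so callers name only what they care about
case class ReportConfig2(
  title: String,
  dataSource: String,
  format: String,
  filters: Map[String, String] = Map(),
  columns: List[String] = List("*"),
  sortBy: Option[String] = None,
  limit: Option[Int] = None
)

val quickReport = ReportConfig2(
  title = "Sales Report",
  dataSource = "sales_db",
  format = "pdf",
  limit = Some(1000)
)
```

The phantom-type builder earns its complexity when construction is spread across multiple call sites or phases; for one-shot construction, defaults usually suffice.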

Singleton via Object

Scala objects are singletons by design: the language guarantees one instance per JVM (strictly, per classloader). This eliminates the verbose double-checked locking patterns from Java and makes the singleton pattern trivial to implement correctly. You don't need special boilerplate; you simply use the object keyword. This is ideal for application-scoped resources like configuration managers, logging providers, connection pools, or utility collections. Because the JVM initializes the object lazily (on first access) and thread-safely under the class-initialization lock, you get safe initialization with no hand-written synchronization. Scala makes singletons so easy that they become the natural choice for application-scoped resources.

// Guaranteed single instance across JVM
object DatabaseConnectionPool {
  private val pool = scala.collection.mutable.Queue[Connection]()
  private val poolLock = new java.util.concurrent.locks.ReentrantLock()

  def acquire(): Connection = {
    poolLock.lock()
    try {
      if (pool.nonEmpty) {
        pool.dequeue()
      } else {
        createNewConnection()
      }
    } finally {
      poolLock.unlock()
    }
  }

  def release(conn: Connection): Unit = {
    poolLock.lock()
    try {
      if (conn.isOpen) {
        pool.enqueue(conn)
      }
    } finally {
      poolLock.unlock()
    }
  }

  private def createNewConnection(): Connection = {
    // Complex initialization logic
    new PostgresConnection("localhost", 5432)
  }

  def shutdown(): Unit = {
    pool.foreach(_.close())
    pool.clear()
  }
}

// Usage - always same instance
val conn1 = DatabaseConnectionPool.acquire()
val conn2 = DatabaseConnectionPool.acquire()
// conn1 and conn2 may be same Connection object from pool
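
The lazy, thread-safe initialization claim can be observed directly: an object's body runs at most once, on first access, under the JVM class-initialization lock. A minimal sketch (names are illustrative):

```scala
object AppConfig {
  // This body runs exactly once, on first access, even under concurrency
  println("AppConfig initialized")
  val maxConnections: Int = 10
}

// Nothing has printed yet; the first access below triggers initialization,
// and any later accesses reuse the already-initialized instance
val limit = AppConfig.maxConnections
```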

Structural Patterns

Adapter via Implicit Conversions

Implicit conversions provide an elegant way to adapt incompatible interfaces at use sites without wrapper classes. When you need to use a library's type where your code expects a different type, an implicit conversion bridges the gap seamlessly. This is particularly valuable when integrating multiple libraries or adapting between old and new APIs. The Scala compiler will automatically apply the conversion wherever it's needed, making code look cleaner and more idiomatic. However, use this pattern carefully: implicit conversions can hide the fact that type transformation is occurring, making code harder to understand. In Scala 3, extension methods are often preferred for clarity, but implicit conversions remain useful for true interface adaptation scenarios.

// Legacy API returns mutable collections
object LegacyAnalytics {
  def getUserMetrics(): java.util.HashMap[String, java.lang.Double] = {
    val map = new java.util.HashMap[String, java.lang.Double]()
    map.put("views", 1000.0)
    map.put("clicks", 150.0)
    map
  }
}

// Modern API expects immutable Scala types
object ReportService {
  def generateMetricsReport(metrics: Map[String, Double]): String = {
    metrics
      .map { case (k, v) => s"$k: $v" }
      .mkString("\n")
  }
}

// Implicit adapter - converts Java to Scala at call site
// (defining implicit conversions requires this language import)
import scala.language.implicitConversions

implicit def javaMapToScalaMap[K, V](
  javaMap: java.util.Map[K, V]
): Map[K, V] = {
  import scala.jdk.CollectionConverters._
  javaMap.asScala.toMap
}

// Seamless integration
val legacyMetrics = LegacyAnalytics.getUserMetrics()
val report = ReportService.generateMetricsReport(legacyMetrics)  // Implicit conversion happens

// Implicit class adds methods to existing types
implicit class OrderOps(order: Order) {
  def discountedTotal(discountRate: Double): BigDecimal = {
    order.total * (1 - discountRate)
  }

  def formatted: String = {
    s"Order ${order.id}: $$${order.total}"
  }

  def isExpired: Boolean = {
    java.time.Instant.now().isAfter(
      order.createdAt.plusSeconds(86400 * 30)  // 30 days
    )
  }
}

// New methods now available on Order
val myOrder: Order = ???
val discounted = myOrder.discountedTotal(0.1)
val display = myOrder.formatted
val expired = myOrder.isExpired
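
As noted above, Scala 3's extension methods are often clearer than implicit classes at both definition and call site. A sketch of the `OrderOps` adapter in Scala 3 syntax, assuming a minimal stand-in `Order` type (the original leaves `Order` undefined):

```scala
import java.time.Instant

// Minimal stand-in for the Order type used above
case class Order(id: String, total: BigDecimal, createdAt: Instant)

// Scala 3 extension methods: no wrapper class is defined or allocated
extension (order: Order)
  def discountedTotal(discountRate: Double): BigDecimal =
    order.total * (1 - discountRate)
  def formattedLine: String =
    s"Order ${order.id}: $$${order.total}"
```

The call sites read exactly as with the implicit class, but the mechanism is explicit and easier for tooling to navigate.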

Decorator via Stackable Traits

The Decorator pattern, expressed through Scala's trait composition mechanism, is far more powerful than the traditional Java approach. Rather than wrapping objects, you layer traits that add functionality while delegating to the next layer via super. This "stackable traits" pattern lets you compose behaviors at instantiation time, so concerns like logging, timestamping, buffering, and authentication can be added to any base implementation without modifying it. The power lies in the flexibility: you can mix and match traits in any combination, keeping cross-cutting concerns orthogonal, and the traits integrate seamlessly with Scala's type system instead of requiring wrapper classes. Let's build a timestamping, leveling, and buffering stack on top of a base file logger, where each layer adds value without cluttering the core logic.

// Base trait
trait Logger {
  def log(message: String): Unit
}

// Base implementation
class FileLogger(filename: String) extends Logger {
  override def log(message: String): Unit = {
    val file = new java.io.FileWriter(filename, true)
    try file.write(s"${java.time.Instant.now()}: $message\n")
    finally file.close()
  }
}

// Stackable decorator traits - each adds functionality
trait TimestampedLogger extends Logger {
  abstract override def log(message: String): Unit = {
    val timestamped = s"[${System.currentTimeMillis()}] $message"
    super.log(timestamped)
  }
}

trait LeveledLogger extends Logger {
  abstract override def log(message: String): Unit = {
    val leveled = s"[INFO] $message"
    super.log(leveled)
  }
}

trait BufferedLogger extends Logger {
  private val buffer = scala.collection.mutable.Buffer[String]()
  private val bufferSize = 100

  abstract override def log(message: String): Unit = {
    buffer += message
    if (buffer.size >= bufferSize) {
      flush()
    }
  }

  def flush(): Unit = {
    buffer.foreach(super.log)
    buffer.clear()
  }
}

// Compose decorators in any order
val logger1 = new FileLogger("app.log")
  with TimestampedLogger
  with LeveledLogger

val logger2 = new FileLogger("app.log")
  with BufferedLogger
  with TimestampedLogger

// Each composition creates different behavior chain
logger1.log("User login")         // Flows: log -> level -> timestamp -> file
logger2.log("User logout")        // Flows: log -> timestamp -> buffer -> file
logger2.flush()

Composite Pattern for Trees

The Composite pattern elegantly handles tree structures by treating leaf nodes and composite nodes uniformly. This is perfect for modeling hierarchical data: file systems (files and directories), UI component trees, expression trees, or organizational hierarchies. Rather than checking "is this a leaf or a composite?" at every step, you define an interface (trait) that both implement, letting you write recursive algorithms that work on any shape. Scala's sealed traits and pattern matching make this pattern natural and type-safe. You can define operations that traverse the entire tree with minimal boilerplate, keeping the code clear and maintainable. This section builds a complete file-system example (files and directories), showing how to compose operations like computing total size, flattening to a file list, and pretty-printing the whole tree.

// Base component trait
sealed trait FileSystemItem {
  def name: String
  def size: Long
  def list(): List[FileSystemItem] = List()
}

// Leaf - single file
case class File(
  name: String,
  content: String
) extends FileSystemItem {
  override def size: Long = content.length.toLong
}

// Composite - directory containing items
case class Directory(
  name: String,
  items: List[FileSystemItem] = List()
) extends FileSystemItem {
  override def size: Long = items.map(_.size).sum

  override def list(): List[FileSystemItem] = items

  def add(item: FileSystemItem): Directory = {
    copy(items = items :+ item)
  }

  def addAll(newItems: List[FileSystemItem]): Directory = {
    copy(items = items ++ newItems)
  }
}

// Operations work on both leaves and composites uniformly
def totalSize(item: FileSystemItem): Long = item match {
  case f: File => f.size
  case d: Directory =>
    d.items.map(totalSize).sum
}

def flattenToFiles(item: FileSystemItem): List[File] = item match {
  case f: File => List(f)
  case d: Directory =>
    d.items.flatMap(flattenToFiles)
}

def printTree(item: FileSystemItem, indent: String = ""): Unit = item match {
  case File(name, _) =>
    println(s"${indent}📄 $name")
  case Directory(name, items) =>
    println(s"${indent}📁 $name/")
    items.foreach(printTree(_, indent + "  "))
}

// Usage
val fs = Directory("root")
  .add(File("readme.txt", "Hello"))
  .add(
    Directory("src")
      .add(File("Main.scala", "object Main..."))
      .add(File("Config.scala", "case class Config..."))
  )
  .add(
    Directory("test")
      .add(File("MainTest.scala", "class MainTest..."))
  )

printTree(fs)
println(s"Total size: ${totalSize(fs)} bytes")
val allFiles = flattenToFiles(fs)
println(s"Found ${allFiles.length} files")

Behavioral Patterns

Strategy via Higher-Order Functions

The Strategy pattern encapsulates different algorithms and makes them interchangeable at runtime. In Java, this typically requires a strategy interface and multiple implementations. In Scala, a simple function type is often sufficient: pass the algorithm you want as a parameter. Passing behavior as a higher-order function is more flexible than class-based strategies: you define your core logic once, and callers provide the specific algorithm they need. This is perfect for sorting (pass a comparison function), filtering (pass a predicate), transforming (pass a transformation function), or any scenario where behavior should be pluggable. Function-valued strategies are simpler, more testable, and more composable than their class-based counterparts.

// Sorting strategies
type SortStrategy[T] = (List[T]) => List[T]

val ascendingSort: SortStrategy[Int] = _.sorted
val descendingSort: SortStrategy[Int] = _.sorted.reverse
val evenFirstSort: SortStrategy[Int] = { nums =>
  val (even, odd) = nums.partition(_ % 2 == 0)
  even.sorted ++ odd.sorted
}

// Data processor using strategy
class DataProcessor[T](strategy: SortStrategy[T]) {
  def process(data: List[T]): List[T] = {
    strategy(data)
  }
}

// Switch strategies at runtime
val processor = new DataProcessor(ascendingSort)
processor.process(List(3, 1, 4, 1, 5))  // List(1, 1, 3, 4, 5)

val processorDesc = new DataProcessor(descendingSort)
processorDesc.process(List(3, 1, 4, 1, 5))  // List(5, 4, 3, 1, 1)

// Validation strategies
type ValidationStrategy = (String) => Either[String, Unit]

val emailValidation: ValidationStrategy = email => {
  if (email.contains("@")) Right(())
  else Left("Invalid email")
}

val lengthValidation: ValidationStrategy = input => {
  if (input.length >= 3) Right(())
  else Left("Too short")
}

def validate(input: String, strategies: ValidationStrategy*): Either[String, Unit] = {
  strategies.foldLeft[Either[String, Unit]](Right(())) { (result, strategy) =>
    result.flatMap(_ => strategy(input))
  }
}

validate("user@example.com", emailValidation, lengthValidation) // Right(())
validate("a", emailValidation, lengthValidation) // Left("Invalid email") - fails the first strategy
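
The fold above stops at the first failure. When you want every failure reported, the same strategy type composes into an accumulating combinator; a minimal sketch that re-declares the pieces it needs:

```scala
type ValidationStrategy = String => Either[String, Unit]

val nonEmpty: ValidationStrategy =
  s => if (s.nonEmpty) Right(()) else Left("Empty input")
val hasAtSign: ValidationStrategy =
  s => if (s.contains("@")) Right(()) else Left("Invalid email")

// Run every strategy and collect all failures instead of short-circuiting
def validateAll(input: String, strategies: ValidationStrategy*): Either[List[String], Unit] = {
  val errors = strategies.flatMap(_(input).left.toOption).toList
  if (errors.isEmpty) Right(()) else Left(errors)
}
```

The choice between short-circuiting and accumulating is itself a design decision the function signature makes explicit.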

Observer Pattern via Event Streams

The traditional Observer pattern—where observers register callbacks with a subject and get notified of changes—becomes elegant through reactive streams. Instead of imperative callbacks, you model data as streams of events that you can transform, filter, and combine using functional operators. This is far more composable: you can derive new streams from existing ones, handle backpressure naturally, and test event flows like you would test data transformations. Libraries like Akka Streams and Fs2 provide industrial-strength implementations. This section shows how to replace callback-based observers with stream-based reactive patterns, enabling better separation of concerns and more testable code.

import cats.effect._
import cats.implicits._  // for >> sequencing on IO
import fs2._

// Domain events
sealed trait OrderEvent
case class OrderCreated(orderId: String) extends OrderEvent
case class OrderShipped(orderId: String) extends OrderEvent
case class OrderCancelled(orderId: String) extends OrderEvent

// Publish-subscribe with fs2 Queues (fs2 2.x API: enqueue1 / dequeue)
object OrderEventBus {
  private val eventQueue = scala.collection.concurrent.TrieMap[String, Queue[IO, OrderEvent]]()

  def subscribe(topic: String): Stream[IO, OrderEvent] = {
    Stream.eval(
      IO(
        // unsafeRunSync here is a pragmatic shortcut for this sketch;
        // production code would allocate queues inside IO/Resource instead
        eventQueue.getOrElseUpdate(topic, Queue.unbounded[IO, OrderEvent].unsafeRunSync())
      )
    ).flatMap(_.dequeue)
  }

  def publish(topic: String, event: OrderEvent): IO[Unit] = {
    IO(
      eventQueue
        .get(topic)
        .foreach(_.enqueue1(event).unsafeRunSync())
    )
  }
}

// Observer implementation
def logOrderEvents(): Stream[IO, Unit] = {
  OrderEventBus.subscribe("orders")
    .evalMap { event =>
      IO(println(s"Order event: $event"))
    }
}

def sendNotifications(): Stream[IO, Unit] = {
  OrderEventBus.subscribe("orders")
    .evalMap {
      case OrderCreated(id) =>
        IO(println(s"Sending confirmation for $id"))
      case OrderShipped(id) =>
        IO(println(s"Sending tracking info for $id"))
      case OrderCancelled(id) =>
        IO(println(s"Sending cancellation notice for $id"))
    }
}

// Usage
val observers = Stream(
  logOrderEvents(),
  sendNotifications()
).parJoin(2)

// Publishing events in business logic
val createOrder: IO[Unit] = {
  IO(println("Creating order...")) >>
  OrderEventBus.publish("orders", OrderCreated("ORD-001")) >>
  OrderEventBus.publish("orders", OrderShipped("ORD-001"))
}

Chain of Responsibility

The Chain of Responsibility pattern models workflows where requests pass through a series of handlers, each deciding whether to handle the request or pass it along. This is perfect for logging frameworks (different log levels), request middleware (authentication, validation, transformation), and workflow systems (approval chains). Each handler can inspect the request, add side effects (logging, metrics), potentially transform it, and then pass control to the next handler. The pattern keeps each handler focused on a single responsibility, making handlers easy to test and reuse. The classic object-oriented implementation below links handler objects explicitly; in Scala, handlers can also be composed as plain functions.

// Minimal request/response types used by the handlers below
case class HttpRequest(
  method: String,
  path: String,
  headers: Map[String, String] = Map(),
  remoteAddr: String = "127.0.0.1"
)
case class HttpResponse(status: Int, body: String)

// Base handler trait
abstract class RequestHandler {
  protected var next: Option[RequestHandler] = None

  def setNext(handler: RequestHandler): RequestHandler = {
    next = Some(handler)
    handler
  }

  def handle(request: HttpRequest): HttpResponse = {
    if (canHandle(request)) {
      doHandle(request)
    } else {
      next
        .map(_.handle(request))
        .getOrElse(HttpResponse(status = 404, body = "Not Found"))
    }
  }

  protected def canHandle(request: HttpRequest): Boolean
  protected def doHandle(request: HttpRequest): HttpResponse
}

// Concrete handlers
class AuthenticationHandler extends RequestHandler {
  override protected def canHandle(request: HttpRequest): Boolean = {
    request.path.startsWith("/api/")
  }

  override protected def doHandle(request: HttpRequest): HttpResponse = {
    val token = request.headers.get("Authorization")
    token match {
      case Some(_) => next.map(_.handle(request))
        .getOrElse(HttpResponse(status = 200, body = "OK"))
      case None => HttpResponse(status = 401, body = "Unauthorized")
    }
  }
}

class RateLimitHandler extends RequestHandler {
  private val requestCount = scala.collection.mutable.Map[String, Int]()

  override protected def canHandle(request: HttpRequest): Boolean = true

  override protected def doHandle(request: HttpRequest): HttpResponse = {
    val ip = request.remoteAddr
    val count = requestCount.getOrElse(ip, 0) + 1
    requestCount(ip) = count

    if (count > 100) {
      HttpResponse(status = 429, body = "Too Many Requests")
    } else {
      next.map(_.handle(request))
        .getOrElse(HttpResponse(status = 200, body = "OK"))
    }
  }
}

class LoggingHandler extends RequestHandler {
  override protected def canHandle(request: HttpRequest): Boolean = true

  override protected def doHandle(request: HttpRequest): HttpResponse = {
    println(s"${java.time.Instant.now()} ${request.method} ${request.path}")
    next.map(_.handle(request))
      .getOrElse(HttpResponse(status = 200, body = "OK"))
  }
}

// Build chain
val handler = new LoggingHandler()
handler
  .setNext(new AuthenticationHandler())
  .setNext(new RateLimitHandler())

// Process requests through chain
val request = HttpRequest("GET", "/api/users", Map("Authorization" -> "Bearer token"))
val response = handler.handle(request)
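
The same chain can be expressed functionally: a handler is a function that either produces a response or declines with None, and the chain is simply the first Some. A minimal, self-contained sketch with simplified stand-in types (not the HttpRequest/HttpResponse used above):

```scala
case class Req(path: String, authed: Boolean)
case class Res(status: Int, body: String)

// A handler responds with Some(response) or declines with None
type Handler = Req => Option[Res]

val auth: Handler = r =>
  if (r.path.startsWith("/api/") && !r.authed) Some(Res(401, "Unauthorized")) else None

val router: Handler = r =>
  if (r.path == "/api/users") Some(Res(200, "user list")) else None

// First handler to respond wins; fall through to 404
def chain(handlers: Handler*): Req => Res = r =>
  handlers.iterator.flatMap(h => h(r)).nextOption().getOrElse(Res(404, "Not Found"))

val app = chain(auth, router)
```

Each handler stays single-purpose, there is no mutable next pointer, and the chain order is explicit in one expression.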

Scala-Specific Patterns

Cake Pattern (Dependency Injection)

The Cake Pattern uses self-type annotations to compose modules with dependencies, achieving dependency injection without external frameworks. A module trait declares what dependencies it needs via self-type annotations, and components provide implementations. Composition happens by mixing in traits. This pattern was popular in Scala before modern alternatives like given/using, but it's still valuable for large modular systems. The advantage is compile-time safety and clear dependency graphs. The disadvantage is that the pattern can be hard to understand for newcomers. We'll show how the Cake Pattern works and compare it to modern alternatives, so you can decide when to use it. For new code, Scala 3's given/using is often simpler, but understanding Cake helps you read legacy Scala codebases.

// Component traits
trait UserRepositoryComponent {
  def userRepository: UserRepository

  class UserRepository {
    def findById(id: String): Option[User] = ???
  }
}

trait UserServiceComponent { self: UserRepositoryComponent =>
  def userService: UserService

  class UserService {
    def getUser(id: String): Option[User] = {
      userRepository.findById(id)
    }

    def updateUser(user: User): Boolean = ???
  }
}

trait EmailServiceComponent {
  def emailService: EmailService

  class EmailService {
    def sendWelcome(user: User): Unit = ???
  }
}

trait AuthServiceComponent { self: UserServiceComponent with EmailServiceComponent =>
  def authService: AuthService

  class AuthService {
    def register(email: String, password: String): User = {
      val user = User(java.util.UUID.randomUUID().toString, email)
      userService.updateUser(user)
      emailService.sendWelcome(user)
      user
    }
  }
}

// Production configuration - mix components
object ProductionConfig extends
  UserRepositoryComponent with
  UserServiceComponent with
  EmailServiceComponent with
  AuthServiceComponent {
  val userRepository = new UserRepository()
  val userService = new UserService()
  val emailService = new EmailService()
  val authService = new AuthService()
}

// Test configuration - mock dependencies
object TestConfig extends
  UserRepositoryComponent with
  UserServiceComponent with
  EmailServiceComponent with
  AuthServiceComponent {
  val userRepository = new UserRepository {
    override def findById(id: String): Option[User] = {
      Some(User(id, "test@example.com"))
    }
  }

  val userService = new UserService()
  val emailService = new EmailService {
    override def sendWelcome(user: User): Unit = {
      println(s"[TEST] Would send welcome to ${user.email}")
    }
  }

  val authService = new AuthService()
}

// Usage
val productionAuth = ProductionConfig.authService
val testAuth = TestConfig.authService
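
For comparison, the same wiring in Scala 3's given/using is considerably lighter. A minimal sketch with hypothetical simplified services, not the full component set above:

```scala
case class User(id: String, email: String)

class UserRepository:
  def findById(id: String): Option[User] = Some(User(id, "user@example.com"))

// Dependencies are declared as context parameters instead of self-types
class UserService(using repo: UserRepository):
  def getUser(id: String): Option[User] = repo.findById(id)

// Wiring: givens replace the cake's mixed-in components
given UserRepository = UserRepository()
given UserService = UserService()

def lookup(id: String)(using svc: UserService): Option[User] = svc.getUser(id)
```

Swapping in a test repository is just a different given in scope, with no parallel `TestConfig` object hierarchy.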

Type Class Pattern

Type classes are a design pattern (not a language feature) that lets you extend types with new behavior without modifying their definitions or using wrapper objects. You've learned about type classes as a concept; here we see them as a reusable pattern for solving real design problems. The pattern is so powerful that it's the foundation of libraries like Cats, Scalaz, and Play JSON. By mastering type classes as a pattern, you unlock the ability to build extremely flexible, composable libraries where users can provide custom behavior for their types. This is how Scala frameworks achieve such elegant, maintainable code compared to more rigid OOP approaches. We'll see the pattern in action, showing how it enables libraries to support user-defined types without those types knowing about the library.

// Type class definition
trait Serializer[T] {
  def serialize(value: T): String
  def deserialize(json: String): Either[String, T]
}

// Instances for various types
implicit val intSerializer: Serializer[Int] = new Serializer[Int] {
  override def serialize(value: Int): String = value.toString
  override def deserialize(json: String): Either[String, Int] = {
    try Right(json.toInt) catch { case e: NumberFormatException => Left(e.getMessage) }
  }
}

implicit val stringSerializer: Serializer[String] = new Serializer[String] {
  override def serialize(value: String): String = s"\"$value\""
  override def deserialize(json: String): Either[String, String] = {
    Right(json.stripPrefix("\"").stripSuffix("\""))
  }
}

// Extension methods via implicit class
implicit class SerializableOps[T](value: T) {
  def serialize(implicit serializer: Serializer[T]): String = {
    serializer.serialize(value)
  }
}

implicit class DeserializableOps(json: String) {
  def deserialize[T](implicit serializer: Serializer[T]): Either[String, T] = {
    serializer.deserialize(json)
  }
}

// Generic function using type class
def toJson[T](values: List[T])(implicit serializer: Serializer[T]): String = {
  values.map(_.serialize).mkString("[", ",", "]")
}

// Usage - no explicit type class selection needed
val numbers = List(1, 2, 3)
val json = toJson(numbers)  // Uses implicit intSerializer

val text = "hello".serialize  // Uses implicit stringSerializer
val restored: Either[String, String] = "\"hello\"".deserialize[String]
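
The composability the pattern is known for comes from derived instances: given an instance for `T`, you can build one for `Option[T]` (or `List[T]`) automatically. A self-contained sketch that re-declares a minimal one-method serializer:

```scala
trait Ser[T] { def serialize(value: T): String }

implicit val intSer: Ser[Int] = (v: Int) => v.toString

// Derived instance: any Ser[T] induces a Ser[Option[T]]
implicit def optionSer[T](implicit s: Ser[T]): Ser[Option[T]] =
  (v: Option[T]) => v match {
    case Some(x) => s.serialize(x)
    case None    => "null"
  }

def write[T](v: T)(implicit s: Ser[T]): String = s.serialize(v)
```

This is how JSON libraries derive codecs for your types from codecs for their fields, without your types depending on the library.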

Loan Pattern for Resource Management

The Loan Pattern "loans" a resource to a block of code, guaranteeing cleanup after the block completes, even if an exception occurs. This was Scala's idiomatic approach to resource management before scala.util.Using arrived in 2.13. Higher-order functions make the pattern elegant: you pass a block that receives the resource, execute it, and ensure cleanup in a finally clause. This guarantees that resources like file handles, database connections, and locks are always released. scala.util.Using provides similar guarantees out of the box, but the Loan Pattern remains useful for custom resource types and more complex management scenarios, and understanding it helps you design clean resource-management APIs.

// Generic loan pattern
def loan[R, A](resource: R)(cleanup: R => Unit)(use: R => A): A = {
  try {
    use(resource)
  } finally {
    cleanup(resource)
  }
}

// Specific implementations
def withFile[A](path: String)(operation: java.io.BufferedReader => A): A = {
  val reader = new java.io.BufferedReader(
    new java.io.FileReader(path)
  )
  try {
    operation(reader)
  } finally {
    reader.close()
  }
}

def withDatabaseConnection[A](
  url: String,
  user: String,
  password: String
)(operation: java.sql.Connection => A): A = {
  val conn = java.sql.DriverManager.getConnection(url, user, password)
  try {
    operation(conn)
  } finally {
    conn.close()
  }
}

// Usage
val fileContent = withFile("data.txt") { reader =>
  Iterator.continually(reader.readLine()).takeWhile(_ != null).mkString("\n")
}

val queryResult = withDatabaseConnection(
  "jdbc:postgresql://localhost/db",
  "user",
  "password"
) { conn =>
  val stmt = conn.createStatement()
  val rs = stmt.executeQuery("SELECT * FROM users")
  // Materialize rows before the block ends and the connection closes;
  // a ResultSet must not escape its connection's lifetime
  val rows = scala.collection.mutable.ListBuffer[String]()
  while (rs.next()) rows += rs.getString(1)
  rows.toList
}
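
Since 2.13, the standard library packages the loan pattern as scala.util.Using, which works with any AutoCloseable (or a custom Releasable) and returns a Try. A minimal sketch using an in-memory reader to stay self-contained:

```scala
import java.io.{BufferedReader, StringReader}
import scala.util.{Try, Using}

// Using(resource)(block) closes the resource even if the block throws
val firstLine: Try[String] =
  Using(new BufferedReader(new StringReader("hello\nworld"))) { reader =>
    reader.readLine()
  }
```

Using.Manager additionally lets you acquire several resources with a single cleanup point, replacing nested loan functions.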

Magnet Pattern for Overloading

The Magnet Pattern works around limitations of Scala's method overloading: without magnets, you'd need multiple overloads for different parameter types, which interact badly with type inference, default arguments, and erasure. Magnets use implicit conversions to unify the overloads under a single method. A magnet is a trait with an apply method (in the full version of the pattern it also carries an abstract Result type). The compiler's implicit resolution converts each argument type to the appropriate magnet instance. This is an advanced pattern: powerful but complex, and in modern Scala, features like union types (Scala 3) are usually preferable. You'll still meet magnets in production libraries, so understanding the pattern helps you read complex APIs. We'll show a practical example of how magnets enable flexible overloading.

// Problem: multiple overloads are tedious
// Solution: use a magnet type

// Magnet trait
trait LogMagnet {
  def apply(): Unit
}

// Magnet implementations (implicit conversions need this language import)
import scala.language.implicitConversions

implicit def stringLogMagnet(msg: String): LogMagnet = {
  () => println(s"[STRING] $msg")
}

implicit def exceptionLogMagnet(ex: Throwable): LogMagnet = {
  () => println(s"[ERROR] ${ex.getMessage}")
}

implicit def intLogMagnet(count: Int): LogMagnet = {
  () => println(s"[COUNT] $count")
}

implicit def tupleLogMagnet(pair: (String, Any)): LogMagnet = {
  () => println(s"[TUPLE] ${pair._1}=${pair._2}")
}

// Single unified method
def log(magnet: LogMagnet): Unit = {
  magnet()
}

// Works with any type via implicit conversion
log("Hello")                    // Uses stringLogMagnet
log(new Exception("Failed"))    // Uses exceptionLogMagnet
log(42)                         // Uses intLogMagnet
log(("user_id", "USR-123"))     // Uses tupleLogMagnet
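
The Scala 3 union-type alternative mentioned above removes the implicit machinery entirely: one method, one explicit match. A sketch that returns the formatted line instead of printing, purely to make it easy to check:

```scala
// Scala 3 union types: the accepted argument types are spelled out in the signature
def renderLog(value: String | Throwable | Int): String = value match
  case s: String    => s"[STRING] $s"
  case e: Throwable => s"[ERROR] ${e.getMessage}"
  case n: Int       => s"[COUNT] $n"
```

Unlike magnets, a wrong argument is an ordinary type error and IDE navigation works; magnets still earn their keep when each parameter type must produce a different result type.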

When NOT to Use Patterns

Patterns are a means, not an end: the best code uses them only where they pay for themselves. Use patterns to solve real problems, not to satisfy design dogma:

// OVER-ENGINEERED: Pattern overkill
trait AbstractFactoryPatternProducerImpl {
  def getAbstractFactory(): AbstractFactory = ???
}

object AbstractFactoryPatternProducerImplSingleton
  extends AbstractFactoryPatternProducerImpl {
  // ...
}

// BETTER: Direct approach
def getCalculator(calcType: String): Calculator = calcType match {
  case "simple" => SimpleCalculator()
  case "scientific" => ScientificCalculator()
  case other => throw new IllegalArgumentException(s"Unknown calculator type: $other")
}

// OVER-ENGINEERED: Unnecessary abstraction
class UserProcessor {
  def process(user: User): ProcessedUser = {
    val strategy = new UserProcessingStrategy()
    strategy.execute(user)
  }
}

// BETTER: Just do it
def processUser(user: User): ProcessedUser = {
  ProcessedUser(user.id, user.email.toLowerCase, user.name.trim)
}

// Key principle: YAGNI (You Aren't Gonna Need It)
// Write code for current requirements, not imagined future flexibility.