Mutex alternatives in Swift


I have memory shared between multiple threads. I want to prevent these threads from accessing the same piece of memory at the same time (like the producer-consumer problem).

Problem:

One thread adds elements to a queue, and another thread reads these elements and deletes them. They shouldn't access the queue simultaneously.

One solution to this problem is to use Mutex.

As far as I can tell, there is no Mutex type in Swift. Are there any alternatives?


There are 5 answers

beshio (Best Answer)

As people commented (including me), there are several ways to achieve this kind of lock. But I think a dispatch semaphore is better than the others because it seems to have the least overhead. As described in Apple's doc, "Replacing Semaphore Code", it doesn't go down into kernel space unless the semaphore is already locked (i.e. its count is zero), which is the only case where the code drops into the kernel to switch threads. I think the semaphore is non-zero most of the time (which is of course an app-specific matter), so we can avoid a lot of overhead.

One more comment on dispatch semaphores, covering the opposite scenario to the above. If your threads have different execution priorities, and the higher-priority threads have to hold the semaphore for a long time, a dispatch semaphore may not be the solution. This is because there is no "queue" among the waiting threads. What happens in that case is that the higher-priority threads acquire and hold the semaphore most of the time, while the lower-priority threads can lock it only occasionally and thus spend most of their time just waiting. If this behavior is not acceptable for your application, consider a dispatch queue instead.

A.bee

Thanks to beshio's comment, you can use a semaphore like this:

let semaphore = DispatchSemaphore(value: 1)

Call wait() before using the resource:

semaphore.wait()
// use the resource

and release it after you are done:

semaphore.signal()

Do this in each thread.
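
Putting the pieces together for the producer-consumer queue from the question, a minimal sketch could look like this (the SharedQueue type and its method names are my own illustration, not from the answer above):

import Foundation

final class SharedQueue<Element> {
    private var elements: [Element] = []
    private let semaphore = DispatchSemaphore(value: 1)  // value 1: the resource starts out available

    // Called from the producer thread.
    func enqueue(_ element: Element) {
        semaphore.wait()             // acquire exclusive access
        elements.append(element)
        semaphore.signal()           // release it
    }

    // Called from the consumer thread.
    func dequeue() -> Element? {
        semaphore.wait()
        defer { semaphore.signal() } // always released, even on early return
        return elements.isEmpty ? nil : elements.removeFirst()
    }
}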

Devanshu Saini

There are many solutions for this, but I use serial queues for this kind of action:

let serialQueue = DispatchQueue(label: "queuename")
serialQueue.sync { 
    // call your code here; I pass a closure in from a method
}
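
Applied to the shared queue from the question, the same serial-queue idea might look like this sketch (SynchronizedQueue and its method names are just illustrative):

import Foundation

final class SynchronizedQueue<Element> {
    private var elements: [Element] = []
    // A DispatchQueue created without .concurrent is serial, so only one closure runs at a time.
    private let accessQueue = DispatchQueue(label: "com.example.synchronized-queue")

    func enqueue(_ element: Element) {
        accessQueue.sync { elements.append(element) }
    }

    func dequeue() -> Element? {
        accessQueue.sync { () -> Element? in
            elements.isEmpty ? nil : elements.removeFirst()
        }
    }
}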

Edit/Update: and here is an example with semaphores and queues of different priorities:

let higherPriority = DispatchQueue.global(qos: .userInitiated)
let lowerPriority = DispatchQueue.global(qos: .utility)

let semaphore = DispatchSemaphore(value: 1)

func letUsPrint(queue: DispatchQueue, symbol: String) {
    queue.async {
        debugPrint("\(symbol) -- waiting")
        semaphore.wait()  // requesting the resource

        for i in 0...10 {
            print(symbol, i)
        }

        debugPrint("\(symbol) -- signal")
        semaphore.signal() // releasing the resource
    }
}

letUsPrint(queue: lowerPriority, symbol: "Low Priority Queue Work")
letUsPrint(queue: higherPriority, symbol: "High Priority Queue Work")

RunLoop.main.run()

saagarjha

On modern platforms (macOS 10.12+, iOS 10+) os_unfair_lock is a performant, efficient general-purpose mutex, especially if your critical section is short. It's much more lightweight (30x smaller) than a queue and tracks priority, preventing inversions that can occur with a DispatchSemaphore.

As with most low-level synchronization primitives, it needs to have a stable address, so you should either allocate it yourself, or use OSAllocatedUnfairLock if it is available on newer (macOS 13+, iOS 16+) systems. If these aren't an option for you, or you're not comfortable working with the lock directly, NSLock adds a small amount of overhead but is not a bad alternative, especially when compared to a queue or semaphore :)
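
A minimal sketch of the "allocate it yourself" route, wrapping the lock so it keeps a stable address (the UnfairLock class and its withLock helper are my own names, not part of the OS API):

import os.lock

final class UnfairLock {
    // Heap-allocate the lock so it has a stable address, as os_unfair_lock requires.
    private let lockPointer: UnsafeMutablePointer<os_unfair_lock>

    init() {
        lockPointer = .allocate(capacity: 1)
        lockPointer.initialize(to: os_unfair_lock())
    }

    deinit {
        lockPointer.deinitialize(count: 1)
        lockPointer.deallocate()
    }

    func withLock<T>(_ body: () throws -> T) rethrows -> T {
        os_unfair_lock_lock(lockPointer)
        defer { os_unfair_lock_unlock(lockPointer) }
        return try body()
    }
}

Each thread would then call lock.withLock { /* touch the shared queue */ } around its critical section.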

Juraj Antas

You can use NSLock or NSRecursiveLock. If you need to call one locking function from another locking function, use the recursive version.

import Foundation

class X {
  let lock = NSLock()

  func doSome() {
    lock.lock()
    defer { lock.unlock() }
    //do something here
  }

}
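
And a minimal sketch of the recursive case, where one locking method calls another (the method names are just illustrative):

import Foundation

class Y {
  let lock = NSRecursiveLock()

  func outer() {
    lock.lock()
    defer { lock.unlock() }
    inner() // re-enters the same lock on the same thread without deadlocking
  }

  func inner() {
    lock.lock()
    defer { lock.unlock() }
    // do something here
  }
}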