BLE central connection processor usage patterns

This section describes the processor availability and interrupt processing time for the SoftDevice in a central connection event.

Figure 1. Central connection events

In a central connection event, the pattern of SoftDevice processing activity at interrupt priority level 0 is typically as shown in Figure 1.

SoftDevice processing activity in the different priority levels during central connection events is outlined in Table 1. The typical case is observed when receiving GATT Write Commands carrying 20 bytes of data. The maximum case can occur when sending and receiving maximum-length packets while simultaneously initiating encryption, maintaining the maximum number of connections, and using the Radio Timeslot API and the Flash memory API.

Table 1. Processor usage latency when connected

Parameter                  Description                                                            Min      Typical    Max
tISR(0),RadioPrepare       Processing preparing the radio for a connection event.                 -        96 μs      120 μs
tISR(0),RadioStart         Processing when starting the connection event.                         -        61 μs      70 μs
tISR(0),RadioProcessing    Processing after sending or receiving a packet.                        -        80 μs      150 μs
tISR(0),PostProcessing     Processing at the end of a connection event.                           -        270 μs     730 μs
tnISR(0)                   Distance between connection event interrupts.                          30 μs    > 155 μs   -
tISR(2)                    Priority level 2 interrupt after a packet is sent or received.         -        160 μs     -
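
The Max column is useful when budgeting latency for application interrupt handlers that run at a lower priority than the SoftDevice. As a minimal sketch (the 1 ms budget and the constant names are illustrative assumptions, not part of the SoftDevice API), the worst-case values from Table 1 can be checked against such a budget at compile time:

/* Sketch only: Max-column values from Table 1 expressed as constants so an
 * application can sanity-check its own latency budget (C11). */
#include <assert.h>
#include <stdio.h>

#define T_ISR0_RADIO_PREPARE_MAX_US     120
#define T_ISR0_RADIO_START_MAX_US        70
#define T_ISR0_RADIO_PROCESSING_MAX_US  150
#define T_ISR0_POST_PROCESSING_MAX_US   730

/* Longest single priority level 0 activity during a central connection
 * event (Max column); a lower-priority application interrupt may be
 * delayed by at least this much while that activity runs. */
#define SD_ISR0_WORST_CASE_US  T_ISR0_POST_PROCESSING_MAX_US

/* Hypothetical application requirement: interrupt handlers must tolerate
 * up to 1 ms of added latency. */
#define APP_LATENCY_BUDGET_US  1000

static_assert(APP_LATENCY_BUDGET_US >= SD_ISR0_WORST_CASE_US,
              "Latency budget smaller than worst-case SoftDevice activity");

int main(void)
{
    printf("Worst-case single SoftDevice activity: %d us (budget %d us)\n",
           SD_ISR0_WORST_CASE_US, APP_LATENCY_BUDGET_US);
    return 0;
}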

From the table, the typical processing time for a central connection event in which one packet is sent and one packet is received can be calculated as

tISR(0),RadioPrepare + tISR(0),RadioStart + tISR(0),RadioProcessing + tISR(0),PostProcessing + 2 * tISR(2) = 827 μs

which means that typically more than 99% of the processor time is available to the application when one central link is maintained and one packet is sent in each direction at a 100 ms connection interval.
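
The same arithmetic can be captured in a short C sketch (the constants are the typical values from Table 1; the names are illustrative only), which reproduces the 827 μs figure and the resulting CPU availability for a 100 ms connection interval:

/* Sketch only: typical per-event SoftDevice processing time (Table 1) and
 * the resulting CPU availability for a given connection interval. */
#include <stdio.h>

int main(void)
{
    /* Typical values from Table 1, in microseconds. */
    const unsigned t_radio_prepare_us    = 96;
    const unsigned t_radio_start_us      = 61;
    const unsigned t_radio_processing_us = 80;
    const unsigned t_post_processing_us  = 270;
    const unsigned t_isr2_us             = 160;

    /* One packet sent and one received: the priority level 2 interrupt
     * runs twice per connection event. */
    const unsigned per_event_us = t_radio_prepare_us + t_radio_start_us +
                                  t_radio_processing_us + t_post_processing_us +
                                  2U * t_isr2_us;                /* 827 us */

    const unsigned conn_interval_us = 100000U;                   /* 100 ms */
    const double used_pct = 100.0 * per_event_us / conn_interval_us;

    printf("%u us per event -> %.2f%% CPU used, %.2f%% available\n",
           per_event_us, used_pct, 100.0 - used_pct);
    return 0;
}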

