Sunday, 18 October 2015

4G RACH optimisation through SON

The scope of LTE SON (Self Optimising Networks) is vast and, although reading through the standard might make one think that images like the one above are a thing of the past, what vendors offer and operators deploy today is fairly limited. The feature most obviously in use is ANR (Automatic Neighbour Relations) and, true to its name, it has made neighbour planning a thing of the past in LTE deployments.

I was recently looking through some air interface logs from a UK operator and was pleasantly surprised to see one more SON feature in use, namely RACH reporting. RACH performance has historically relied on drive testing to quantify, as a failed procedure is not recorded by the network. With RACH reporting, the UE (if it supports the feature) can be requested to report how many preambles it used to access the network and whether it encountered any contention. In a simple solution this information could just be recorded statistically for an operator to look at. In a full SON solution, the requested preamble power could be adjusted in either direction, depending on whether UEs are reporting too many preambles or too few; more RACH signatures could be assigned to contention-based use if contention is widely reported (typically the signature pool is statically split between contention-based and contention-free usage); or some other relevant parameter change could be triggered automatically.
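A full SON loop along these lines could be sketched as follows. This is purely a hypothetical illustration: the report fields mirror the UE Information Response (preamble count, contention indicator), but the function name, thresholds and adjustment steps are my own assumptions, not any vendor's actual algorithm.

```python
# Hypothetical SON tuning sketch based on UE RACH reports.
# Thresholds and step sizes are illustrative assumptions only.

def tune_rach(reports, target_power_dbm, cfra_preambles):
    """reports: list of (preambles_sent, contention_detected) tuples,
    one per reporting UE, as carried in the UE Information Response."""
    avg_preambles = sum(r[0] for r in reports) / len(reports)
    contention_rate = sum(1 for r in reports if r[1]) / len(reports)

    # Too many preambles per access attempt -> raise the requested
    # target power; consistently one preamble -> power may be higher
    # than it needs to be, so back it off.
    if avg_preambles > 3:
        target_power_dbm += 2
    elif avg_preambles < 1.5:
        target_power_dbm -= 2

    # Widespread contention -> enlarge the contention-based signature
    # pool by shrinking the contention-free (CFRA) allocation.
    if contention_rate > 0.2 and cfra_preambles > 4:
        cfra_preambles -= 4

    return target_power_dbm, cfra_preambles
```

Either output could then be written back as the broadcast RACH configuration, closing the loop without any drive testing.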

The procedure itself starts with the UE reporting its support as shown below.
The network can then request the UE to report on the outcome of the RACH procedure through the UE Information Request procedure, as shown below.
Finally the UE reports the result in the UE Information Response message. In this particular example, two preambles had to be sent as the first one was met with contention.
A simple solution to an age-old mobile network problem. Quite good..

Tuesday, 12 August 2014

Small cells out in the open

As mentioned in previous posts I am a big fan of small cells/femto cells, so it was great to see Vodafone in the UK using the product in a novel way. Essentially they are deploying these in small rural communities with no existing macro coverage, but rather than the more typical operator-led installation, they are asking rural communities to contact them and also provide the physical locations for installation and the necessary broadband connectivity. So all Vodafone have to do is turn up, mount the product on a chimney/wall/post and off you go. There is lots more detail here.

These small cells typically radiate around 1W, as compared to the 20-30W of a typical macro base station and can handle around 32 connected users. They are also self configuring (cell ID, PSC, neighbour detection) so require very little or no planning.

Small cells become a lot more interesting (and complicated) when they are deployed in the presence of a macro (so called HetNets), but even so the above story is still very interesting and encouraging to see.

Saturday, 31 May 2014

XLTE & the marketing side of technology

I was recently reading about Verizon's "XLTE" and it got me thinking about the marketing side of technology and specifically mobile technology.

Essentially XLTE is not a new technology, it is just Verizon's deployment of LTE over 20MHz of spectrum. This is something many other countries have deployed from day one, but in the US it has become a big marketing deal. I imagine Verizon paid a lot of money for that additional spectrum and quite a lot to upgrade eNodeBs and antennas, so in order to get a return on investment a big marketing campaign was put into place. But how do you market 20MHz of spectrum? Here in the UK, EE has marketed it as "double speed" (double because their initial deployment was over 10MHz). But I guess that is quite boring. "XLTE" sounds much better.

All this of course is not new. To my recollection, it started with HSDPA. How do you market HSDPA? Surely not as High Speed Downlink Packet Access. A few terms appeared: there was 3G+, 3.5G, Super 3G, Turbo 3G. As HSDPA evolved, we also had HSPA+ and some operators even called it 4G!

What about WB-AMR? "Do you want a phone that supports Wide Band Adaptive Multi Rate Sir?" Probably not. HD Voice however sounds great.

Needless to say, this will continue. LTE Advanced with Carrier Aggregation is just around the corner (actually launched already in Korea). So, XXXLTE maybe? 4.5G? 5G even? Let's see..

Wednesday, 18 December 2013

Optimal spectrum refarming for LTE

When looking to refarm some spectrum for LTE (e.g. 1800MHz spectrum from GSM) the following simple approach will lead to optimal results.

Start by thinking of how much spectrum you would ideally refarm. This will typically be 20MHz. Assuming this was possible, pick the centre frequency for this allocation; this will be your EARFCN. Then look at how much spectrum you can actually refarm. This will typically be less, as the traffic on the legacy RAT might not have reduced enough, or frequency re-planning your whole legacy network will take time. Most operators go for 10MHz, but in some cases 5MHz is also used.

Deploy your network.

After some time has passed and more spectrum is available, keep the centre frequency the same and just expand the bandwidth. Some cells might be using 10MHz, some 15MHz or 20MHz, but because the centre frequency has not changed, all mobility can be intra-frequency. No need for inter-frequency handovers, no need for additional neighbour planning, no need for measurement gaps, no need for additional SIBs being broadcast. UEs will seamlessly reselect & handover, taking into account the used bandwidth every time, as this is broadcast in the MIB which is read in idle mode and after every handover.
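As a sketch, the EARFCN-to-frequency mapping from 3GPP TS 36.101 shows why this works for band 3 (1800MHz). The example EARFCN below is an arbitrary assumption for illustration, not taken from a real deployment.

```python
# Band 3 downlink EARFCN to carrier centre frequency, per TS 36.101:
# F_DL = F_DL_low + 0.1 * (N_DL - N_Offs-DL), with F_DL_low = 1805 MHz
# and N_Offs-DL = 1200 for band 3.

def earfcn_to_freq_mhz(earfcn, f_dl_low=1805.0, n_offs_dl=1200):
    return f_dl_low + 0.1 * (earfcn - n_offs_dl)

# Pick the centre of the eventual 20MHz allocation once...
earfcn = 1617  # hypothetical example -> 1846.7 MHz

# ...then grow the bandwidth over time without ever moving the centre.
for bw_mhz in (10, 15, 20):
    centre = earfcn_to_freq_mhz(earfcn)
    print(f"{bw_mhz}MHz cell: {centre - bw_mhz / 2:.1f}"
          f"-{centre + bw_mhz / 2:.1f} MHz, EARFCN {earfcn}")
```

Because every cell, whatever its bandwidth, broadcasts the same EARFCN, a UE moving between them never needs an inter-frequency measurement.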

Although the above might sound like the obvious way of doing things, both EE in the UK (see here) and other LTE deployments (see here) don't follow this but rather offset their two bandwidth allocations leading to needless inter-frequency mobility.

Sunday, 8 December 2013

PRACH preamble power considerations in LTE

Unlike UMTS, the PRACH in LTE is used only for the transmission of random access preambles. These are used when the UE wants to access the system from RRC idle, as part of the RRC re-establishment procedure following a radio link failure, during handover or when it finds itself out of sync.

As part of the PRACH procedure the UE needs to determine the power to use for the transmission of the preamble, and for this it looks at SIB2 for the preambleInitialReceivedTargetPower IE. As shown in the extract above (taken from a live network), this is expressed in dBm and in this specific case it is set to -104dBm. So this is the expected power level of the PRACH preamble when it reaches the eNodeB.

Also broadcast is the reference signal power, which in our case is set to 18dBm. Based on this and a current measurement of the RSRP, the UE can determine the pathloss. Once it knows the pathloss it can then determine how much power it needs to allocate to the PRACH preamble to reach the eNodeB at -104dBm.

So let's say that the UE measures an RSRP of -80dBm. Based on the broadcasted reference signal power it can calculate the pathloss, PL = 18 - (-80) = 98dB. This means that for a preamble to reach the eNodeB at -104dBm it needs to be transmitted at PPRACH = -104 + 98 = -6dBm. That is fine.

But what happens if we consider other values of RSRP? For example, cell edge? Cell edge can be determined by the value of qRxLevMin. Looking at SIB1 from the same network we can see that this is set to -126dBm (IE x 2).

So at an RSRP of -126dBm the pathloss is PL = 18 - (-126) = 144dB, which means the UE needs to transmit the preamble at PPRACH = -104 + 144 = 40dBm. Is this OK? Actually no, as LTE UEs are only capable of transmitting at a maximum power of 23dBm. Does this mean the UE does not even go through the PRACH procedure? No, but it will be limited to transmitting at 23dBm, meaning that the preamble will reach the eNodeB at -121dBm, and so the probability of a successful detection is very low.

In actual fact based on this network we can say that anywhere in the cell where the RSRP is below -109dBm will lead to a power limited PRACH attempt and a lower probability of detection. This is something to think about next time your LTE signal strength is low and your phone seems unresponsive..
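The calculations above can be put together in a short sketch. The constants are the values broadcast by this particular network; the function names are my own.

```python
# Values broadcast by the network discussed in the post.
REF_SIGNAL_POWER = 18   # dBm, reference signal power
TARGET_POWER = -104     # dBm, preambleInitialReceivedTargetPower
UE_MAX_POWER = 23       # dBm, UE power class limit

def prach_tx_power(rsrp_dbm):
    """Preamble transmit power, clamped to the UE's maximum."""
    pathloss = REF_SIGNAL_POWER - rsrp_dbm
    return min(TARGET_POWER + pathloss, UE_MAX_POWER)

def power_limited_rsrp():
    """RSRP below which the preamble can no longer reach the target."""
    max_pathloss = UE_MAX_POWER - TARGET_POWER   # 127 dB
    return REF_SIGNAL_POWER - max_pathloss       # -109 dBm

print(prach_tx_power(-80))    # -6 dBm, as in the first example
print(prach_tx_power(-126))   # 23 dBm: clamped, preamble arrives at -121dBm
print(power_limited_rsrp())   # -109 dBm
```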

Sunday, 27 October 2013

3.2Mbps @ 2003? I don't think so..

Three UK have put up the above graphic on their website, here, depicting their network evolution from a throughput point of view.

It is quite nice to look at but was 3.2Mbps possible in 2003? I don't think so, as at that time only R99 networks were available and the max throughput was 384kbps. This was only increased with the first HSDPA networks in 2005 and even then speeds were limited to 1.8Mbps (category 12 devices).

Let's see how long it takes for them to correct this.. :)

Wednesday, 16 October 2013

Deep dive into commercial LTE networks

I was recently in Greece and took the opportunity to have a closer look at two commercial LTE networks in order to understand what capability and configuration operators are using in the field.

The networks in question were Cosmote and Vodafone. Cosmote launched 4G around November 2012, initially limited to PS devices (dongles, Mi-Fis etc.), and later added support for smartphones. Vodafone has had a very limited LTE offering since the end of 2012, also restricted to PS devices, but has since expanded its LTE network and added smartphone support.

Both operators are using re-farmed 1800MHz spectrum to support 4G services. Each operator seems to have re-farmed 20MHz of spectrum, which is split into a 10MHz block and an overlapping 20MHz block. In busy/important areas the 20MHz block is used; in other areas, the 10MHz block. Obviously, as the spectrum overlaps, the two blocks cannot be used in the same geographic area. Details are shown below.
Vendor details are not usually shared in the public domain although occasionally certain vendors/operators do make announcements about awarded contracts. In this particular case I could not find any announcements in the public domain but what I call "L3 signatures" indicate that Vodafone are using Huawei EUTRAN while Cosmote uses NSN.

Idle Mode Selection/re-selection:
Unlike 3G, which uses both a quality (Qqualmin) and a signal strength (Qrxlevmin) selection criterion, LTE only uses signal strength (at least in Release 8). Vodafone have configured their Qrxlevmin to -128dBm RSRP while Cosmote use -130dBm. Although the standard allows for down to -140dBm, both of these values can be considered quite low. The obvious benefit is that UEs stay in LTE for longer; the obvious question is what performance is like (especially in the uplink) at such low RSRP values.
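As a minimal sketch, the Release 8 suitability check (the S-criterion of 3GPP TS 36.304, with Qrxlevminoffset and Pcompensation ignored for simplicity) comes down to a single subtraction:

```python
# Simplified S-criterion: a cell is suitable while
# Srxlev = measured RSRP - Qrxlevmin stays above 0.
# (Qrxlevminoffset and Pcompensation are assumed zero here.)

def srxlev(rsrp_dbm, qrxlevmin_dbm):
    return rsrp_dbm - qrxlevmin_dbm

# The two networks' configured minima, checked at -129dBm RSRP:
for operator, qrxlevmin in (("Vodafone", -128), ("Cosmote", -130)):
    suitable = srxlev(-129, qrxlevmin) > 0
    print(f"{operator}: suitable at -129dBm RSRP -> {suitable}")
```

At -129dBm the Cosmote cell is still (just) suitable while the Vodafone cell is not, which is exactly the 2dB difference in the configured values.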

Specifically for intra-frequency cell re-selection, Vodafone UEs start searching when the RSRP is -68dBm or less. They perform the reselection if the neighbour cell is 4dB stronger.

Cosmote UEs also start searching when the RSRP is -68dBm or less, but perform the reselection if the neighbour cell is 2dB stronger.

From an inter-frequency (not applicable for these networks) and IRAT re-selection point of view, LTE uses priorities similar to HCS. The configured priorities are shown below. As can be expected 4G has the highest priority followed by 3G and finally 2G.
Vodafone UEs will start measuring lower priority RATs at -118dBm and reselect when the serving cell RSRP falls below -128dBm.
Cosmote UEs will start measuring lower priority RATs at -124dBm and reselect at the same threshold.

Intra-frequency mobility:
Intra-frequency mobility in LTE is governed by event A3. A comparison of the A3 configuration for each operator is shown below.
a3-offset is IE x 0.5dB, so Vodafone trigger a HO when the neighbour cell is a3-offset + hysteresis stronger, which equates to 3dB. Cosmote also trigger a HO at 3dB, however they don't make use of the hysteresis.
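For reference, the simplified A3 entering condition from 3GPP TS 36.331 is Mn > Ms + a3-offset + hysteresis (all other offsets assumed zero). The 2dB + 1dB split for Vodafone below is a hypothetical illustration; the logs only establish that the sum is 3dB.

```python
# Simplified event A3 entering condition: the neighbour (Mn) must
# exceed the serving cell (Ms) by a3-offset + hysteresis, in dB.
# Other A3 offsets (Ofn, Ocn, Ofs, Ocs) are assumed zero here.

def a3_triggered(neighbour_rsrp, serving_rsrp, a3_offset_db, hysteresis_db):
    return neighbour_rsrp > serving_rsrp + a3_offset_db + hysteresis_db

# "Vodafone-style": hypothetical 2dB offset + 1dB hysteresis = 3dB total.
print(a3_triggered(-90, -94, a3_offset_db=2, hysteresis_db=1))  # True: 4dB > 3dB

# "Cosmote-style": 3dB offset, no hysteresis.
print(a3_triggered(-90, -92, a3_offset_db=3, hysteresis_db=0))  # False: only 2dB
```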

Connected mode IRAT mobility:
Both operators use event A2 to trigger IRAT mobility actions. The mobility mechanism itself is an RRC connection release with redirect (PS handover is not supported). Vodafone use a measurement-based approach: two A2 thresholds are defined. The first, at -120dBm RSRP, triggers measurements of 3G & 2G neighbour cells, and depending on what is detected by the UE the appropriate redirect RAT is selected. The second A2 threshold, at -126dBm, triggers a blind redirect straight to 3G; this is used when the UE has not reported anything back following the first A2 event.

Cosmote on the other hand only use the blind re-direct and the UE is always re-directed to 3G at -124dBm RSRP.

Both operators use "basic" CSFB (i.e. no DMCR, no SIB tunnelling) to 3G (2100MHz band).
Interestingly enough, Cosmote use a CSFB inter-working function as described here. Although this eliminates any TAC-to-LAC planning, it does create an additional call setup delay, as shown from the measurements below.

RRC connection management:
The RRC state machine in 4G is very simple, as only two states are defined: Idle and Connected. To transition from RRC_CONNECTED to RRC_IDLE, Vodafone use a 5s inactivity timer while Cosmote use a 30s inactivity timer. It can be expected that 5s leads to an increase in signalling, while 30s will impact battery life (assuming connected mode DRX is shorter than idle mode DRX, which in this case it is).

That is it: quite an interesting deep dive into commercial LTE deployments, and it also establishes something of a baseline for other networks I look at.