The Next Horizon: Unleashing Sixth Sense with 6G


    Sep 23, 2022

    This is the sixth blog in our 6G White Paper series looking at how technology will continue to evolve as the world adopts 6G networks.

    Check out the other posts in the series here: 6G: The Next Horizon

    Even as users are still becoming accustomed to 5G, it already falls short of the requirements of networked sensing.

    6G will open up a world that allows us to create highly realistic 3D versions of the physical world, providing us with what may feel like a sixth sense.

    By reflecting signals off of objects, 6G will allow us to expand our awareness beyond what is physically around us, with machines able to independently sense what is there, the way things are moving, and even what they are made of.

    Networked sensing creates a completely new type of usage scenario that covers a range of use cases such as localization, imaging, environment reconstruction and monitoring, and gesture and activity recognition. The sensing usage scenario also adds new performance dimensions, such as detection probability, sensing resolution and accuracy.

    Let’s take a closer look at some of the significant and practical ways that networked sensing will shape the future that’s already on the horizon.

    Simultaneous imaging, mapping, and localization

    6G will allow for spatial sensing technology to be used to its full extent, with entire surroundings and their conditions being mapped out in 3D — even surroundings that are completely unknown. Using mmWave or THz bands, imaging, mapping, and localization capabilities will work hand in hand to create a clear virtual representation of the physical world that offers properties that a camera cannot provide.

    The imaging function captures images of the surrounding environment, while the localization function obtains locations of surrounding objects. The mapping function then uses these images and locations to construct a map, which in turn helps the localization function improve the inference of locations.

    Sensing devices with this kind of technology could be used in self-driving cars, drones, and robots, enabling ultra-high resolution and accuracy in all weather conditions (yes, even in the dark) and the ability to see what’s around the corner before the car has even gotten there. Simultaneous imaging, mapping, and localization will also transform the world of XR. When playing a game that seems to take place in a real-life environment, the world would adapt with you as you move around, making the line between what’s real and what’s digital blur even further.

    High-accuracy localization and tracking

    Machines are already prevalent in today’s world, but technology with 6G-empowered localization and tracking capabilities will provide a new level of independence to those machines. Ever wished an appliance would just assemble itself? Or that a glass of wine could near-magically be brought to you from across a crowded room without you having to lift a finger? That will soon be possible. 

    6G will allow scattered and reflected wireless signals to help robots calculate their position relative to both moving and stationary objects. Delay, Doppler, and angular spectrum information will allow robots to navigate with centimeter-level accuracy, without requiring any coordinates supplied by a human.
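As a rough illustration of the delay-based part of this idea (a sketch of our own, not taken from the white paper), a position can be recovered from signal delays to a few anchors with known locations using linearized least squares:

```python
import numpy as np

C = 3e8  # speed of light, m/s

def localize(anchors, delays):
    """Estimate a 2D position from one-way signal delays to anchors
    with known positions, via linearized least squares.
    anchors: (n, 2) anchor coordinates in meters
    delays:  (n,) measured one-way delays in seconds
    """
    anchors = np.asarray(anchors, dtype=float)
    ranges = C * np.asarray(delays, dtype=float)
    # Subtracting the first range equation ||p - a_0||^2 = r_0^2 from
    # the others turns the problem into the linear system A p = b.
    A = 2 * (anchors[1:] - anchors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Example: a robot at (3, 4) measuring delays to three fixed anchors.
anchors = [(0, 0), (10, 0), (0, 10)]
true_p = np.array([3.0, 4.0])
delays = [np.linalg.norm(true_p - np.array(a, dtype=float)) / C for a in anchors]
print(localize(anchors, delays))  # ≈ [3. 4.]
```

Real systems would also fold in Doppler and angle-of-arrival measurements and account for clock offsets and noise; this only shows the core geometry.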

    In addition to high-accuracy localization, applications such as automatic docking and multi-robot cooperation also place high demands on relative localization. When a swarm of robots collaboratively lifts and carries a complex-shaped mechanical part, or a drone docks with a moving vehicle that has a small landing margin, it is critical for each robot or drone to determine its location with respect to the others. This localization and tracking technology could be used in factories, warehouses, hospitals, and retail shops, as well as in sectors such as agriculture and mining.

    Augmented human senses         

    Enabled by ultra-high-resolution imaging, 6G will make the invisible visible. Safe, precise, and low-power augmented human-like sensing capabilities will be able to detect what the naked eye cannot and can be used in the form of portable or wearable devices — or even implanted ones if we so wished.

    By utilizing the penetration characteristics of electromagnetic waves, even a smartphone could detect cables in walls, find pinprick leaks in water pipes or implement contactless flaw detection and quality control in smart factories.

    The wider radio frequency range being explored for 6G, combined with the reuse of communication waveforms, opens up the possibility of sensing and imaging through materials such as skin, suitcases, and furniture, down to the millimeter level. Spectrogram recognition is another part of this concept: identifying targets by sensing the spectrogram of their electromagnetic or photonic characteristics. Because different materials have unique absorption characteristics that can be captured by THz signals, a 6G device could estimate the calories on our plates or provide an environmental PM2.5 analysis.
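To make the spectrogram-recognition idea concrete, here is a toy sketch of our own (the material signatures below are invented placeholders, not real THz data) that matches a measured absorption spectrum against a small reference library:

```python
import numpy as np

# Hypothetical absorption signatures sampled at five THz frequencies
# (arbitrary units). A real material library would be measured in a
# lab, not invented like these.
REFERENCE = {
    "water":   np.array([0.9, 0.8, 0.7, 0.6, 0.5]),
    "sugar":   np.array([0.2, 0.6, 0.3, 0.7, 0.2]),
    "plastic": np.array([0.1, 0.1, 0.2, 0.1, 0.1]),
}

def identify(measured):
    """Match a measured absorption spectrum to the closest reference
    signature by Euclidean distance between unit-normalized spectra."""
    m = np.asarray(measured, dtype=float)
    m = m / np.linalg.norm(m)
    best, best_d = None, float("inf")
    for name, ref in REFERENCE.items():
        d = np.linalg.norm(m - ref / np.linalg.norm(ref))
        if d < best_d:
            best, best_d = name, d
    return best

# A slightly noisy measurement of something watery.
sample = REFERENCE["water"] + 0.05 * np.array([1, -1, 1, -1, 1])
print(identify(sample))  # "water"
```

Normalizing both spectra before comparing makes the match depend on spectral shape rather than overall signal strength.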

    Gesture and activity recognition

    Humans cannot be everywhere all the time to ensure safety for those who need it most. Imagine being immediately alerted when an elderly or already-injured loved one takes a spill. That’s what 6G could provide.

    Device-free gesture and activity recognition, enabled by machine learning, is key to next-generation human-computer interaction: users will soon be able to communicate with devices conveniently through gestures and actions. Such recognition falls into two types: macro and micro.

    Macro recognition refers to body movements. One example is automatically supervising patient safety in future smart hospitals, such as detecting falls or monitoring rehabilitation exercises. Compared with traditional camera-based monitoring, one of the key benefits of networked sensing is privacy protection. Micro recognition refers to gestures, finger movements, and facial expressions. We will soon be playing piano and drawing pictures in the air, while hearing real music and seeing real paintings created by XR in real time.
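As a toy illustration of the machine-learning side (entirely synthetic; the feature vectors below are made-up stand-ins for Doppler-spectrogram summaries), a minimal nearest-centroid gesture classifier might look like this:

```python
import numpy as np

# Made-up training features standing in for Doppler-spectrogram
# summaries of each gesture; a real system would extract features
# from radio measurements and train a far richer model.
TRAIN = {
    "wave":  [np.array([0.9, 0.1, 0.2]), np.array([0.8, 0.2, 0.1])],
    "push":  [np.array([0.1, 0.9, 0.1]), np.array([0.2, 0.8, 0.2])],
    "swipe": [np.array([0.1, 0.2, 0.9]), np.array([0.2, 0.1, 0.8])],
}

# Precompute one centroid (mean feature vector) per gesture.
CENTROIDS = {g: np.mean(v, axis=0) for g, v in TRAIN.items()}

def classify(features):
    """Return the gesture whose centroid is nearest to the features."""
    f = np.asarray(features, dtype=float)
    return min(CENTROIDS, key=lambda g: np.linalg.norm(f - CENTROIDS[g]))

print(classify([0.85, 0.15, 0.15]))  # "wave"
```

The same nearest-centroid idea covers both macro gestures (whole-body features) and micro gestures (finger- or face-level features); only the feature extraction would differ.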

    Wrapping Up

    Ultimately, sensing technologies will open up opportunities that allow us to broaden our senses beyond what they are capable of today. From mapping to activity recognition, we will soon be able to use networked sensing to see and predict everything that is, in fact, already there. We just don’t know it yet.

    Subscribe to this blog to keep pace with this series on 6G – as well as the latest tech – and download the white paper: 6G: The Next Horizon – From Connected Things to Connected Intelligence.

    Disclaimer: Any views and/or opinions expressed in this post by individual authors or contributors are their personal views and/or opinions and do not necessarily reflect the views and/or opinions of Huawei Technologies.

