PhD researcher Anthony Wall from the Tyndall National Institute explains the ABCs of ADCs and why analogue technologies remain relevant.
From a very young age, Anthony Wall has been “fascinated by electronics”. He wanted to know “how all those colourful components in our PC at home could work together to make a mouse move or display a picture”.
His curiosity got the better of him and he admits, “I soon started taking electronics apart and building dangerous contraptions of my own, to the torment of my parents”.
It was almost inevitable Wall would study electrical and electronic engineering at university. His keen interest paid off in his third year when he won the Joe Gantly Prize in Engineering for best placement project for his work on a high-sensitivity capacitive touch sensor at Infineon Technologies.
For his final-year project, under the supervision of Dr Daniel O’Hare, he designed a low-power analogue front end for a fluorescence system used to detect skin cancers. The project won the Microelectronic Design Association of Ireland’s third-level Project of the Year award.
He graduated from University College Cork (UCC) with a bachelor’s degree in 2018. His PhD research has already won him a Wrixon Research Excellence award, a recently established prize to recognise and support Tyndall-based postgraduate students.
‘I was exposed to many flavours of electrical engineering, but analogue electronics quickly became my passion’
Tell us about your current research.
After graduating in 2018, I felt I needed to research analogue-to-digital converters (ADCs) more before entering the fast-paced, chip-a-year consumer electronics world. The company that sponsored me agreed to fund my PhD at the Microelectronic Circuits Centre Ireland (MCCI) at the Tyndall National Institute, and gave my supervisor, Dr Daniel O’Hare, and me free rein to investigate novel types of ADC.
The word ‘analogue’ in ADC refers to an electrical analogue of some physical-world signal. A temperature sensor, for example, senses room temperature and converts it to an electrical signal. That electrical signal is the analogue of the temperature, but it can now be processed by electronics.
Traditionally, when someone refers to an ADC, they are referring to a device that converts a voltage signal to digital. There are many electrical analogues though: current, resistance, capacitance etc.
Using the analogy of water flowing in a pipe, you can think of the voltage as the pressure pushing the water through the pipe and the current as the quantity of water flowing through it. Nowadays, we are seeing more sensors which convert physical phenomena not to a voltage, but to a current, for example electrochemical tests (such as those for Covid-19) and light detectors (used in everything from telecoms to cancer detection). For this reason, we decided to focus our research on small, power-efficient current-to-digital converters.
If you were asked to measure how much water (current) is flowing out of a pipe, you might put a bucket of a known size under the end of the pipe, fill the bucket, dump the water and count how many times the bucket fills in a minute. From that, you can calculate how much water is flowing in the pipe. This is exactly how we measure electronic current signals too: instead of water molecules flowing in a pipe, we measure the flow of electrical charge in a wire, and instead of a bucket, we use a capacitor, which is akin to a bucket for electrical charge.
You would get tired dumping buckets all day though, and just as you could automate the process, we use circuitry to automatically dump the capacitor when we detect that it has filled.
My research focuses on ensuring the accuracy (linearity) of the measurement. You can imagine that dumping the water from the bucket takes time, and that affects the number of times the bucket can fill in a minute. The core of my research has been to make the capacitor charge dump as fast and repeatable as possible so that the amount of time wasted dumping the charge is well known and can be subtracted from the overall filling time. This results in a much more accurate and low-power current sensor.
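The bucket-counting scheme described above can be sketched as a short simulation. This is a minimal, idealised illustration only – the component values, function names and dead-time model are assumptions for the sake of the example, not details of the actual chip – but it shows why subtracting a well-known dump time restores the accuracy of the count.

```python
# Illustrative sketch of the 'bucket' (charge-counting) current measurement.
# All names and values here are hypothetical, chosen only to demonstrate
# the effect of the capacitor-dump dead time on accuracy.

def measure_current(i_in, c_bucket=1e-12, v_ref=1.0, t_dump=1e-7, t_meas=1e-3):
    """Count how many times a capacitor 'bucket' fills during t_meas.

    i_in     -- input current, amperes
    c_bucket -- capacitance (the bucket size), farads
    v_ref    -- voltage at which the bucket counts as 'full'
    t_dump   -- dead time spent dumping the charge each cycle, seconds
    t_meas   -- total measurement window, seconds
    """
    q_full = c_bucket * v_ref                 # charge per full bucket, coulombs
    t_fill = q_full / i_in                    # time to fill one bucket
    dumps = int(t_meas / (t_fill + t_dump))   # complete fill-and-dump cycles

    # Naive estimate: pretend all of t_meas was spent filling buckets.
    i_naive = dumps * q_full / t_meas
    # Corrected estimate: subtract the known time wasted dumping charge.
    i_corrected = dumps * q_full / (t_meas - dumps * t_dump)
    return i_naive, i_corrected

# With a 100 nA input, the naive count reads low because of dump dead time,
# while the corrected value lands back near 100 nA.
naive, corrected = measure_current(i_in=100e-9)
```

The design point mirrors the article: the dump can never be instantaneous, so making it fast and, above all, repeatable is what lets its contribution be subtracted out cleanly.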
‘We saw a reflection of what can happen when the formal research community doesn’t adequately communicate with the curious public’
Instead of a five-gallon bucket, all of this happens on the microscale of a silicon CMOS chip measuring 20 microns x 20 microns – narrower than a human hair – while burning 100 times less power than a small LED indicator lamp. We can measure currents down to 1 nanoampere (nA), one million times each second, allowing the detection of very fast dynamics in electrochemical reactions, for example.
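To give a sense of scale for those figures, a quick back-of-envelope calculation (illustrative only, using the quoted sensitivity and sample rate) shows how little charge each 'bucket' cycle involves at the bottom of the range:

```python
# Back-of-envelope: charge collected per sample at 1 nA, sampled
# one million times per second (i.e. one sample every microsecond).
E_CHARGE = 1.602e-19      # charge of one electron, coulombs

i_min = 1e-9              # 1 nanoampere
t_sample = 1e-6           # 1 microsecond per sample at 1M samples/second

q_per_sample = i_min * t_sample        # = 1e-15 C, one femtocoulomb
electrons = q_per_sample / E_CHARGE    # roughly 6,000 electrons per sample
```

In other words, each measurement resolves a packet of only a few thousand electrons, which is why the linearity of the dump cycle matters so much.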
In your opinion, why is your research important?
As we want to process more and more data from the world around us, it’s critically important that we can get that data from the real world into our devices in the first place. As an example, consider the sensor suite on each smartphone: touch sensors, cameras, microphones, pressure sensors, and more. It is necessary to digitise the electrical signal from each of these sensors, and doing so in a power-efficient and compact manner has enabled the technology each of us has in our pockets. The trend of sensing more of our world at the edge continues, with ADCs and analogue processing chips being key enabling technologies for this.
What inspired you to become a researcher?
Throughout my time in UCC, I was exposed to many flavours of electrical engineering, but analogue electronics quickly became my passion. I loved the idea of manipulating voltages and currents to perform whatever task you’re trying to achieve, whether that be distorting sound for a guitar pedal or detecting the presence of human touch on a touch button.
During my undergraduate degree, I got work experience designing touch sensors, where I developed a passion for taking real-world signals and converting them to digital signals which can be processed by our devices – analogue-to-digital conversion.
What are some of the biggest challenges or misconceptions you face as a researcher in your field?
Several years ago, we all threw out our analogue televisions for superior digital ones. We speak of the digital age, the digital transformation etc. This gives people the impression that analogue electronics is passé, and that digital is the way forward.
In recent years, there has been huge interest in artificial intelligence and machine learning for processing massive datasets, and many young researchers have been attracted to this field of study. However, it’s important to remember that without analogue signal processing and ADCs specifically, much of the data used by these AI models would be unavailable.
Furthermore, there has been renewed interest in analogue signal processing as a part of the AI revolution as analogue circuits more closely resemble the brain in the way that they can process signals.
Do you think public engagement with science has changed in recent years?
I think the general level of public engagement with science and engineering has increased, which is unquestionably a good thing. It was fascinating to see at the beginning of the Covid-19 pandemic that everyone became an epidemiologist overnight, with everyone having theories and ideas of how the situation might play out.
That said, I think we saw a reflection of what can happen when the formal research community doesn’t adequately communicate with the curious public. I think of the theories around the introduction of 5G and linking it with the pandemic. It’s easy as a researcher to dismiss crackpot theories, but at the same time, I do think some of the blame lies with us for not adequately engaging with the public on what it is that we do, and that it isn’t magical or evil.
I know some people who have done circuits for 5G, they’re sound. I fill electronic buckets for a living. None of us are particularly evil – I just think we need to communicate that better.