How to choose between analog signal processing (ASP) and digital signal processing (DSP): analog filters or digital filters?
Drones and DC motor control – How the ASN Filter Designer can save you a lot of time and effort
Drones are one of the golden nuggets in IoT. No wonder: they can play a pivotal role in deliveries to congested cities and remote areas. They can also be a great help in surveying large areas, or places that are difficult or dangerous to reach. However, much of the technology is still in its experimental stage.
Because drones carry a lot of sensors, Advanced Solutions Nederland did some research into how drone-producing companies have solved their sensor technology challenges, especially regarding DC motor control.
Until now: solutions developed with great difficulty
We found that most producers spend weeks or even months finding solutions to their sensor technology challenges. With the ASN Filter Designer, they could arrive at a solution within days, or perhaps even hours. Moreover, we expect that the resulting measurements would be better too.
The biggest time sink is that, until now, algorithms were developed by hand: they were designed in a lab environment and then tested in real life. Based on the test results, the algorithm would be tweaked and tested again until the desired results were reached. A further challenge stems from the fact that testing conditions in a lab environment are stable, which makes it very hard to get models to work in real life. These steps result in round after round of 'lab development' and 'real-life testing' in order to make any progress – which isn't ideal!
How the ASN Filter Designer can help save a lot of time and effort
The ASN Filter Designer can save a lot of time in the design and testing of algorithms in the following ways:
- Design, analyse and implement filters for drone sensor applications with real-time feedback and our powerful signal analyser.
- Design filters for speed and positioning control for sensorless BLDC (brushless DC) motor applications.
- Speed up deployment to Arm Cortex-M embedded processors.
Real-time feedback and powerful signal analyser
One of the key benefits of the ASN Filter Designer and signal analyser is that it gives real-time feedback. Once an algorithm is developed, it can easily be tested on real-life data. To analyse the real-life data, the ASN Filter Designer has a powerful signal analyser in place.
Design and analyse filters the easy way
You can easily design, analyse and implement filters for a variety of drone sensor applications, including loadcells, strain gauges, torque, pressure, temperature, vibration and ultrasonic sensors, and assess their dynamic performance in real-time for a variety of input conditions. With the ASN Filter Designer, you don't have to do any coding yourself or wrestle with specifications: you simply draw the filter magnitude specification and the tool calculates the coefficients for you.
Speed up deployment
Perform detailed time/frequency analysis on captured test datasets and fine-tune your design. Our Arm CMSIS-DSP and C/C++ code generators and software frameworks speed up deployment to a DSP, FPGA or micro-controller.
An example: designing BLDC motor control algorithms
BLDC (brushless DC) motors have found use in a variety of application areas, including robotics, drones and cars. They have significant advantages over brushed DC motors and induction motors, such as better speed-torque characteristics, high reliability, longer operating life, noiseless operation and reduced electromagnetic interference (EMI).
One advantage of BLDC motor control compared to standard DC motors is that the motor’s speed can be controlled very accurately using six-step commutation, making it a good choice for precision motion applications, such as robotics and drones.
Sensorless back-EMF and digital filtering
For most applications, monitoring the back-EMF (back-electromotive force) signal of the unexcited phase winding is easier said than done, since it suffers significant noise distortion from the PWM (pulse width modulation) commutation of the other energised windings. The coupling between the motor parameters, especially the inductances, can induce ripple in the back-EMF signal that is synchronous with the PWM commutation, and this induced ripple leads to faulty commutation. Thus, the measurement challenge is: how do we accurately measure the zero-crossings of the back-EMF signal in the presence of PWM interference?
A standard solution is to use digital filtering, i.e. IIR, FIR or even a median (majority) filter. However, the challenge for most designers is how to find the best filter type and optimal filter specification for the motor under consideration.
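To illustrate the median-filtering idea, here is a minimal sketch (an assumption for illustration only, not ASN's implementation) of a 5-point sliding median filter followed by a rising zero-crossing check, as might run inside a BLDC commutation loop:

```c
#include <string.h>

#define MEDIAN_LEN 5   /* odd window length; tune to the PWM ripple period */

/* Return the median of a 5-sample window using an insertion sort.
   A median filter suppresses short PWM-synchronous spikes far better
   than a linear filter of comparable length. */
static float median5(const float *w)
{
    float tmp[MEDIAN_LEN];
    memcpy(tmp, w, sizeof(tmp));
    for (int i = 1; i < MEDIAN_LEN; i++) {
        float key = tmp[i];
        int j = i - 1;
        while (j >= 0 && tmp[j] > key) { tmp[j + 1] = tmp[j]; j--; }
        tmp[j + 1] = key;
    }
    return tmp[MEDIAN_LEN / 2];
}

/* Feed one new back-EMF sample; returns 1 on a rising zero-crossing of
   the filtered signal, which marks the commutation reference point. */
int backemf_zero_crossing(float new_sample)
{
    static float window[MEDIAN_LEN];
    static float prev_filtered;

    memmove(window, window + 1, (MEDIAN_LEN - 1) * sizeof(float));
    window[MEDIAN_LEN - 1] = new_sample;

    float filtered = median5(window);
    int crossing = (prev_filtered < 0.0f) && (filtered >= 0.0f);
    prev_filtered = filtered;
    return crossing;
}
```

In practice the window length, and whether an IIR or FIR stage is cascaded before the zero-crossing check, must be chosen per motor – which is exactly the design trade-off the tool is meant to help explore.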
The solution
The ASN Filter Designer allows engineers to work on speed and position sensorless BLDC motor control applications based on back-EMF filtering to easily experiment and see the filtering results on captured test datasets in real-time for various IIR, FIR and median (majority filtering) digital filtering schemes. The tool’s signal analyser implements a robust zero-crossings detector, allowing engineers to evaluate and fine-tune a complete sensorless BLDC control algorithm quickly and simply.
So, if you have a measurement problem, ask yourself:
Can I save time and money, and reduce the headache of design and implementation with an investment in new tooling?
Our licensing solutions start from just 125 EUR for a 3-month licence.
Find out what we can do for you, and learn more by visiting the ASN Filter Designer’s product homepage.
Generations have been entertained by the gadgets and future technology portrayed in sci-fi series such as Star Trek, but is it all science fiction?
The Tricorder
One device that intrigued me for years was the so-called 'Tricorder', and the doctor's ability to read a person's vital life signs or VLS (e.g. heartbeat and respiration) with a handheld device from about a metre away.
Back to the 21st century!
With advances in radar technology over the last few years, a few chip manufacturers are now producing affordable radar devices suitable for biomedical VLS measurement. Radar technology that used to cost thousands of Euros, and was primarily aimed at military applications, is now available for a few hundred Euros, making it viable for home medical products.
Sounds great, but what can UWB (ultra-wideband) pulse Doppler radar do?
- Millimetre accuracy: allows detection of the smallest changes, such as respiration and heart rate, from several metres away.
- Penetrates duvets, blankets and clothes: the small wavelength ensures accurate detection of humans lying in bed or sitting in a chair watching TV or reading a book.
- Penetrates walls and doors: tracking of VLS and movement when the radar is mounted within a ceiling or behind a wall – no ugly module on the wall!
- High sensitivity: able to see the VLS from tiny premature babies.
- Ultra-safe technology: RF emission is 0.01% of the energy typically found in a household WiFi router – meaning that prolonged exposure will have no detrimental effects on human health.
After receiving a request from a client about monitoring the health of an elderly person living alone or in a nursing home, we decided to conduct a few tests of our own to see what was possible with this new technology.
VLS of a subject lying in bed
After building a demonstrator, and placing the radar sensor about 1 metre from a subject sleeping (similar to the doctor in Star Trek), we obtained the following waveform:
VLS data captured from a UWB radar – containing both heartbeat and respiration information
"Wow!" was our initial reaction to the test data – this is millimetre movement through a duvet! Notice how slow the biomedical signal is: an average adult's respiration rate at rest is about 12 breaths per minute, requiring relatively long data acquisition times (tens of seconds) for meaningful data analysis.
What are the respiration and heartbeat rates?
Passing the signal through our algorithm, we could easily estimate the respiration rate (RR) and heartbeat (HB) from the plot (see the two red squares on the two peaks). However, in order to be objective, we attached a clip-on pulse oximeter to the subject's finger, and as seen the two measurements matched very well.
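Our algorithm itself is beyond the scope of this post, but as a hedged illustration of the basic principle, a minimal respiration-rate estimate can be obtained by counting rising zero-crossings of the mean-removed, low-pass-smoothed displacement signal. The function below is a sketch under those assumptions, not our production code:

```c
/* Estimate respiration rate (breaths/min) from n radar displacement
   samples acquired at fs Hz. The faster heartbeat component is
   attenuated by a single-pole low-pass filter, then breaths are
   counted as rising zero-crossings of the mean-removed signal. */
float estimate_respiration_bpm(const float *x, int n, float fs)
{
    /* remove the DC offset */
    float mean = 0.0f;
    for (int i = 0; i < n; i++) mean += x[i];
    mean /= (float)n;

    const float alpha = 0.05f;   /* smoothing factor; tune to fs */
    float y = 0.0f, prev = 0.0f;
    int crossings = 0;

    for (int i = 0; i < n; i++) {
        y += alpha * ((x[i] - mean) - y);           /* low-pass smoothing */
        if (prev < 0.0f && y >= 0.0f) crossings++;  /* one crossing per breath */
        prev = y;
    }

    float duration_s = (float)n / fs;
    return ((float)crossings / duration_s) * 60.0f;
}
```

A robust product-grade estimator would of course add proper band-pass filtering, outlier rejection and tracking over time – which is where the real algorithm development effort lies.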
What does this all mean for me?
Contactless VLS measurement for home use is closer than you think, and is certainly not science-fiction anymore. This technology opens up many possibilities for monitoring when normal sensors are infeasible, such as premature babies, patients with dementia and even sleep trend analysis. We’ll improve our algorithm in order to make it more robust and faster, but as seen our results are very promising indeed, and open up the possibility of contactless vital life signs (VLS) measurements for many practical applications!
The internet of things (IoT) has gained tremendous popularity over the last few years, as many organisations strive to add IoT smart sensor technologies to their product portfolios. The basic paradigm centres around connecting everything to everything and exchanging all data. This could range from household appliances to more blue-sky applications, such as smart cities. But what does this actually mean for you?
Almost all IoT applications involve the use of sensors. But how do SMEs, and even multi-national organisations, transform their legacy product offering into a 21st-century IoT application? One of the first challenges that many organisations face is how to migrate to an IoT application while balancing design time, time to market, budget and risk.
Sounds interesting? Then read further….
We recently completed a project for a client who manufactured their own sensors, but wanted to improve their sensor measurement accuracy from ±10% to better than ±0.5% without going down the road of a massive re-design project.
The question that they asked us was simply: “Is it possible to get high measurement accuracy performance from a signal that is corrupted with all kinds of interference components without a hardware re-design?”
Our answer: “Yes, but the winning recipe centres around knowing what architectural building blocks to use”.
Traditionally, many design bureaus will evaluate the sensor performance and try and improve the measurement accuracy performance by designing new hardware and adding a few standard basic filtering algorithms to the software. This sort of intuitive approach can lead to very high development costs for only a modest increase in sensor performance. For many SMEs these costs can’t be justified, but perhaps there’s a better way?
Algorithms: the winning recipe
Algorithms and mathematics are usually regarded by many organisations as 'academic black magic' and are generally overlooked as a solution for a robust commercial IoT application. As a consequence, very few organisations actually take the time to analyse a sensor measurement problem analytically, and those who do tend to come up with something that's only usable in the lab. There has been a trend over the years to turn to universities or research institutes, but once again the results are generally too academic and are geared more towards journal publications than towards a robust solution suitable for the market.
Our experience has been that the winning recipe centres around the balance of knowing what architectural blocks to use, and having the experience to assess what components to filter out and what components to enhance. In some cases, this may even involve some minor modifications to the hardware in order to simplify the algorithmic solution. Unfortunately, due to the lack of investment in commercially experienced, academically strong (Masters, PhD) algorithm developers and the pressure of getting a project to the finish line, many solutions (even from reputable multi-national organisations) that we’ve seen over the years only result in a moderate increase in performance.
Despite the plethora of commercially available data analysis software, many organisations opt to do basic data analysis in Microsoft Excel and tend to stay away from any detailed data analysis, as it's considered an unnecessary academic step that doesn't really add any value. This missed opportunity generally leads to problems in the future, where products need to be recalled for a 'round of patchwork' in order to solve the so-called 'unforeseen problems'. A second disadvantage is that the performance of the sensors may be merely satisfactory, whereas a more detailed look may have yielded clues on how to make the sensor performance good, or in some cases even excellent.
Algorithms can save the day!
“Although many organisations regard data analysis as a waste of money, our experience and customers prove otherwise.”
Investing in detailed data analysis at the beginning of a project usually results in some good clues as to what needs to be filtered out and what needs to be enhanced in order to achieve the desired performance. In many cases, these valuable clues allow experienced algorithm developers to concoct a combination of signal processing building blocks without re-designing any hardware – which is very desirable for many organisations! Our experience has shown that this fundamental first step can cut project development costs by as much as 75%, while at the same time achieving the desired smart sensor measurement performance demanded by the market.
So what does this all mean in the real world?
Returning to the story of our customer: after undertaking a detailed data analysis of their sensor data, our developers were able to design a suitable algorithm achieving ±0.1% measurement accuracy (up from the original ±10%) with only minor modifications to the hardware. This enabled the customer to present their IoT application at a trade show and go into production on time – and yes, we stayed within budget!
Author
Sanjeev is an AIoT visionary and expert in signals and systems with a track record of successfully developing over 25 commercial products. He is a Distinguished Arm Ambassador and advises top international blue chip companies on their AIoT solutions and strategies for I4.0, telemedicine, smart healthcare, smart grids and smart buildings.
Internet of things (IoT) devices have been around for a number of years now, but very few smart sensors have any decent level of data security. For many organisations, the issue of data security and secure remote updates to legacy products has become of paramount importance. Unfortunately, many of the engineers who design sensor products have little or no understanding of security algorithms, leading to systems that can be easily hacked – the fiasco of the UK smart meter system is a good example.
Algorithms to the rescue
Algorithms and mathematics are usually regarded by many organisations as 'academic black magic' and are generally overlooked as a solution for a robust commercial IoT application. Nevertheless, some of you may be surprised by how old the concept of an algorithm actually is when it comes to solving real-world problems.
A few weeks ago, I looked through my old PhD thesis and stumbled across a reference to one of the world's first documented algorithms, from the 9th-century mathematician Al-Khwarizmi (the word 'algorithm' is derived from al-Khwarizmi's name).
Al-Khwarizmi undertook pioneering work in algebra, which was popularized in his book, “al-Mukhtasar fi Hisab al-Jabr wa l-Muqabala” and altered society’s perspective of analyzing problems, be they a simple domestic chore or a complex mathematical concept.
An excerpt from “Al-Mukhtasar fi Hisab al-Jabr wa l-Muqabala” for the solution to x^2 + 10x = 39.
Translation: For the squares and roots equal to a number, it is as saying: a square and ten of its roots is equal to thirty-nine dirhams. The solution is to halve roots, equal to five in this problem, then, multiplying the root by itself then this will be twenty-five. Then add it to thirty-nine and this will be sixty-four. Then take the square root, which will be eight and subtract from it half the root, which is five. The remainder is three and that is the root you are seeking and the square is nine.
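In modern notation, Al-Khwarizmi's recipe is simply completing the square; each step below mirrors his words (halve the ten roots to get 5, square it to get 25, add to 39, take the square root and subtract the halved root):

```latex
x^2 + 10x = 39
\;\Rightarrow\; x^2 + 10x + 25 = 39 + 25 = 64
\;\Rightarrow\; (x + 5)^2 = 64
\;\Rightarrow\; x + 5 = 8
\;\Rightarrow\; x = 3, \qquad x^2 = 9.
```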
I had forgotten (well, it was 14 years ago!) how elegant Al-Khwarizmi's work actually was, and I'm sure he would smile at the challenges we're facing today. Nevertheless, without his pioneering work, we wouldn't have any of the IoT and security algorithms that we take for granted today.
Solutions in the 21st century
We’ve been pleasantly surprised by the rich offering from commercial IC vendors, such as: Atmel, NXP and Analog Devices in producing secure micro-controllers for the IoT market. Many of these micro-controllers include all of the necessary hardware encryption building blocks needed for building a secure IoT sensor, and some even offer a decent amount of processor power for data analytics algorithms.
Sounds ideal, right?
The Achilles heel of all of these solutions is how engineers implement them in a system. The micro-controller itself may be 'secure', but what about the system architecture (i.e. the algorithmic building blocks and how they interact with each other)? And what about encryption keys? How are they stored and updated? The UK smart meter system mentioned above used just one key for the whole system – not very secure! It is this aspect that is painfully overlooked by many, and it is what eventually leads to a system being hacked and rendered useless.
In short, hardware-based encryption technology is a great step in the right direction for IoT device security, but without a good understanding of encryption technology as part of the system architecture, the solution is doomed to failure.
It's estimated that the global smart sensor market will comprise over 50 billion smart devices in 2020. At least 80% of these IoT/IIoT smart sensors (temperature, pressure, gas, image, motion, loadcells) will use Arm's Cortex-M technology, with the largest growth in smart image sensors (ADAS) and smart temperature sensors (HVAC).
IoT sensor measurement challenge
The challenge for most is that many of the sensors used in these applications require some filtering to clean the measurement data and make it useful for analysis.
Let's have a look at what sensor data really is. All sensors produce measurement data, and that measurement data contains two types of components:
- Wanted components, i.e. the information we want to know
- Unwanted components: measurement noise, 50/60 Hz powerline interference, glitches etc. – what we don't want to know
Unwanted components degrade system performance and need to be removed.
So, how do we do it?
DSP stands for digital signal processing: a mathematical recipe (algorithm) that can be applied to IoT sensor measurement data in order to clean it and make it useful for analysis.
But that’s not all! DSP algorithms can also help in analysing data, producing more accurate results for decision making with ML (machine learning). They can also improve overall system performance with existing hardware (no need to redesign your hardware – a massive cost saving!), and can reduce the data sent off to the cloud by pre-analysing data and only sending what is necessary.
Nevertheless, DSP has been considered by most to be a black art, limited to those with a strong academic mathematical background. However, for many IoT/IIoT applications, DSP has become a must in order to remain competitive and achieve high performance with relatively low-cost hardware.
Do you have an example?
Consider the following application for gas sensor measurement (see the figure below). The requirement is to determine the amplitude of the sinusoid in order to estimate the gas concentration (the bigger the amplitude, the higher the gas concentration). Analysing the figure, we see that the sinusoid is corrupted with measurement noise (shown in blue), and any estimate based on the blue signal will have a high degree of uncertainty about it – which is not very useful when trying to get an accurate reading of gas concentration!
Algorithms clean the sensor data
After 'cleaning' the sinusoid with a DSP filtering algorithm (red line), we obtain a much more accurate and usable signal, which helps us estimate the amplitude and hence the gas concentration. Notice how easy it is to determine the amplitude of the red line.
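As a hedged illustration of the idea (a minimal sketch, not the filter actually used to produce the red line), the snippet below smooths a noisy sinusoidal measurement with a single-pole IIR low-pass and estimates its amplitude from the RMS value (for a zero-mean sinusoid, amplitude = RMS × √2):

```c
#include <math.h>

/* Smooth n noisy sinusoidal samples with a single-pole IIR low-pass
   (y[k] = y[k-1] + alpha*(x[k] - y[k-1])) and estimate the amplitude
   of the underlying sinusoid from the RMS of the cleaned signal.
   Assumes the signal is zero-mean; remove any DC offset first. */
float estimate_amplitude(const float *x, int n, float alpha)
{
    float y = 0.0f;       /* filter state */
    float sum_sq = 0.0f;

    for (int i = 0; i < n; i++) {
        y += alpha * (x[i] - y);
        sum_sq += y * y;
    }

    float rms = sqrtf(sum_sq / (float)n);
    return rms * sqrtf(2.0f);   /* amplitude of a pure sinusoid */
}
```

The smoothing factor alpha and the choice of a simple first-order filter are assumptions for illustration; a properly designed IIR or FIR filter gives a much sharper separation between signal and noise.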
This is only a snippet of what is possible with DSP algorithms for IoT/IIoT applications, but it should give you a good idea as to the possibilities of DSP.
How do I use this in my IoT application?
As mentioned at the beginning of this article, 80% of IoT smart sensor devices are deployed on Arm's Cortex-M technology. The Arm Cortex-M4 is a very popular choice with hundreds of silicon vendors, as it offers DSP functionality traditionally found in more expensive DSPs. Arm and its partners provide developers with easy-to-use tooling and a free software framework (CMSIS-DSP) to get you up and running within minutes.
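As a brief sketch of what this looks like in practice, the CMSIS-DSP biquad (cascaded second-order IIR) functions can run a designed filter on blocks of sensor samples. The coefficients below are placeholders for illustration (a second-order Butterworth low-pass with its cut-off at a quarter of the sample rate); in a real design you would export the coefficients from your filter design tool:

```c
#include "arm_math.h"   /* CMSIS-DSP */

#define NUM_STAGES  1
#define BLOCK_SIZE  256

/* Coefficients per stage are {b0, b1, b2, a1, a2}. Note that CMSIS-DSP
   expects the feedback coefficients a1 and a2 with negated signs
   relative to the textbook transfer function. */
static float32_t coeffs[5 * NUM_STAGES] = {
    0.2929f, 0.5858f, 0.2929f, 0.0f, -0.1716f   /* placeholder design */
};

static float32_t state[4 * NUM_STAGES];          /* DF1 needs 4 states/stage */
static arm_biquad_casd_df1_inst_f32 filter;

void sensor_filter_init(void)
{
    arm_biquad_cascade_df1_init_f32(&filter, NUM_STAGES, coeffs, state);
}

/* Filter one block of raw sensor samples into the clean buffer. */
void sensor_filter_block(float32_t *raw, float32_t *clean)
{
    arm_biquad_cascade_df1_f32(&filter, raw, clean, BLOCK_SIZE);
}
```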
With the advent of smart cities, and society's obsession with 'being connected', data networks have become overloaded with thousands of IoT sensors sending their data to the cloud, requiring massive and very expensive computing resources to crunch the data.
Is it really a problem?
The collection of all these smaller IoT data streams (from smart sensors) has, ironically, resulted in a big data challenge for the cloud IT infrastructures that need to process these massive datasets – as such, there is no more room for scalability. The situation is further complicated by the fact that a majority of sensor data comes from remote locations, which also presents a massive security risk.
It’s estimated that the global smart sensor market will have over 50 billion smart devices in 2020. At least 80% of these IoT/IIoT smart sensors (temperature, pressure, gas, image, motion, loadcells) will use Arm’s Cortex-M technology, but have little or no smart data reduction or security implemented.
The current state of play
The modern IoT eco system problem is three-fold:
- Endpoint security
- Data reduction
- Data quality
Namely: how do we reduce the data that we send to the cloud, how do we ensure that the data is genuine, and how do we ensure that our endpoint (i.e. the IoT sensor) hasn't been hacked?
The cloud is not infallible!
Traditionally, many system designers have thrown the problem over to the cloud. Data is sent from IoT sensors via a data network (WiFi, Bluetooth, LoRa etc.) and is then encrypted in the cloud. Extra services in the cloud then perform data analysis in order to extract useful information.
So, what’s the problem then?
This model doesn't take into account invalid sensor data. A simple example could be glue failing on a temperature sensor, such that it's no longer bonded to the motor or casing it's monitoring. The sensor will still produce temperature data, but that data is no longer valid for the application.
As for data reduction: the current model is fine for a few sensors, but when the network grows (as is the case with smart cities), the solution becomes untenable, as the cloud is overloaded with data that it needs to process.
Finally, there is no endpoint security: the sensor could be hacked, and the hacker could send fake data to the cloud, which would then be encrypted and passed on to the ML (machine learning) algorithm as genuine data.
What’s the solution?
Algorithms, algorithms… and built-in security blocks.
Over the last few years, hundreds of silicon vendors have been placing security IP blocks into their silicon together with a high-performance Arm Cortex-M4 core. These so-called enhanced micro-controllers offer designers a low-cost and efficient solution for IoT systems for the foreseeable future.
A lot can be achieved by pre-filtering sensor data, checking it, and only sending to the cloud what is necessary. However, as with so many things, knowledge of security and algorithms is paramount for success.
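To make the data-reduction idea concrete, here is a minimal sketch (the function names and threshold scheme are assumptions for illustration, not a specific product's API) in which each block of raw samples is reduced to a few summary statistics on the endpoint, and a report is only transmitted when something has meaningfully changed:

```c
#include <math.h>
#include <float.h>

/* Summary statistics for one block of raw sensor samples. */
typedef struct {
    float mean;
    float min;
    float max;
} block_summary_t;

/* Hypothetical application callback that encrypts and transmits a report. */
extern void send_to_cloud(const block_summary_t *summary);

/* Reduce a block of n raw samples to a summary and only transmit it when
   the mean has moved by more than report_threshold since the last report,
   instead of streaming every raw sample to the cloud. */
void process_block(const float *x, int n, float report_threshold)
{
    static float last_reported_mean;

    block_summary_t s = { 0.0f, FLT_MAX, -FLT_MAX };
    for (int i = 0; i < n; i++) {
        s.mean += x[i];
        if (x[i] < s.min) s.min = x[i];
        if (x[i] > s.max) s.max = x[i];
    }
    s.mean /= (float)n;

    if (fabsf(s.mean - last_reported_mean) > report_threshold) {
        send_to_cloud(&s);
        last_reported_mean = s.mean;
    }
}
```

Combined with the hardware security blocks mentioned above, this kind of endpoint pre-processing keeps both the data volume and the attack surface down.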
When undertaking an NPD (new product development) project, it's always tempting to take shortcuts in order to make the development more attractive to management or a client. I've lost count of the number of times I've heard:
If we had the budget and time, of course we’d do it properly.
The problem is that after doing this several times it becomes the norm, and taking shortcuts in order to potentially save money usually leads to more problems in the long run.
Follow the yellow brick road
For those of you who remember The Wizard of Oz, following the yellow brick road led Dorothy to the Emerald City. The NPD process is the same, and is nicely described on Wikipedia and many other great resources, so there's no reason to ignore it. If you can't get the information out of clients, propose a workshop or private session where you can discuss all requirements and get a clear picture of the client's vision and expectations. Remember that before taking on any new development, you should undertake the task of defining specifications:
- User specifications: a document that specifies what the user(s) expects the product (e.g. software) to be able to do.
- Functional specifications: are requirements that define what a system is supposed to do. A functional specification does not define the inner workings of the proposed system, and does not include the technical details of how the system function will be implemented.
- Technical specifications: based on the Workshops, User and Functional requirements, you can finally construct the technical specifications documentation. This document(s) will contain all technical details of how the system will be implemented, and usually includes tables, equations and sketches of GUI layouts and hardware block diagrams.
- Review: go over the specifications and findings with the client, and make sure that they understand what they will be getting (i.e. the deliverables).
Although it’s tempting to ignore these steps and start playing with software, following the aforementioned steps keeps you focused and almost always leads to shorter development times.
About the author: Sanjeev Sarpal is director of algorithms and analytics at Advanced Solutions Nederland BV. He holds a PhD in signal processing and has over 20 years commercial experience with the design and deployment of algorithms for smart sensor applications.
Many hi-tech companies that we have spoken with over the years have struggled to clarify this question, and almost all say that they fall into the D category, with many saying,
we’re not a University.
However, upon closer examination, it would appear that many companies are actually engaged in applied research activities but don’t even realise it or don’t like advertising it. From our experience, research activities can be broken up into two distinct categories:
- Applied research
- Fundamental research
Fundamental research is where Universities and research institutes are primarily focused, with the sole purpose of advancing theoretical knowledge in that field via publications and conferences.
Applied research on the other hand, takes the fundamental research findings and applies them to real world application for commercial exploitation. Many companies are constantly on the lookout for new concepts and methods to put them in front of their competition. This could take the form of implementing a radically new scientific method, or even improving their internal processes in order to reduce waste and boost profits.
The latter (also applied research) is currently where many companies are focused, mainly due to the current financial situation, but for companies that can afford it, there is still some room for the former.
How do they do it?
Many companies hire academics to help them with the translation of these concepts into products, but experience has shown that academics are not the best choice as product developers, so companies tend to use a mix of skill sets, i.e. academics and non-academics, to get their products to the finish line.
Academics generally prove to be invaluable at analysing problems and processes and highlighting any areas of weakness, while at the same time taking a pivotal role in the definition of technical specifications.
What about culture and egos?
There are ego and cultural aspects to consider too, which are especially prevalent with academics. The mismatch between the academic and commercial models unfortunately leads many to focus on providing the best solution (the academic world), rather than on what the client actually needs (the commercial world).
Many academics consider management as too dumb to understand their fantastic idea.
Good leadership and emotional intelligence are needed to keep the R&D team focused on the big picture, and to stop the department from turning into a playground for hobby projects. Many engineers and academics struggle with knowing when a product is good enough, and keep on fiddling with it until it's perfect.
From our experience, getting clear and realistic customer requirements and then coming up with a careful plan, led by experienced developers is the only way of reaching a successful conclusion.
R, D, or both?
Whether your company falls into R or D or a mixture of both depends on your business model, but perhaps you actually do more R than you realise. In all cases, selection of a good team that works well together is paramount for success.