Will “Space 2.0” Mean Mobile Devices Talking Directly to Satellites?

April 28th, 2015

An editorial in the March issue of IEEE Communications argues that satellites will become a key part of the internet, not just for broadcasting or patching coverage holes. Geostationary (GEO) satellites acting as repeaters are no longer competitive. Low-Earth-Orbit (LEO) satellites, which can provide bi-directional traffic with lower latency, are economical alternatives to building up terrestrial infrastructure in rural locations.

I asked one of the authors, Kul Bhasin, about this. In recent years, he says, the business of high-throughput GEO satellites has grown. He thinks we're at a turning point where LEO satellites become favorable because of falling launch costs.

I asked him if this means mobile devices will eventually connect directly to LEO satellites. He said he is confident it will happen eventually: “LEO Satellite constellations will connect with tablet and/or phone, however I do not know when.” He says they will make use of phased arrays and beamforming to provide the gain required to overcome the path loss using reasonable output power.
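
For a sense of the physics involved, here's a rough free-space path loss sketch in Python. The altitude and frequency are my own illustrative assumptions, not numbers from the editorial:

```python
import math

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss in dB, for distance in km and frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

FREQ_GHZ = 2.0  # assumed S-band link, for illustration only

leo = fspl_db(550, FREQ_GHZ)    # assumed LEO altitude, satellite overhead
geo = fspl_db(35786, FREQ_GHZ)  # geostationary altitude

print(f"LEO: {leo:.1f} dB, GEO: {geo:.1f} dB, difference: {geo - leo:.1f} dB")
# ~153 dB to LEO vs ~190 dB to GEO. That ~36 dB advantage, plus beamforming
# gain from a phased array, is what makes a handheld-to-satellite link
# plausible at reasonable output power.
```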

I recall someone doing a demonstration at a hamfest 20 years ago, talking to a satellite in LEO with a Yagi antenna connected to a 2 W handheld transmitter. He had to adjust the frequency a little to correct for the Doppler shift as the satellite approached and then moved past us. It seemed amazing that he could talk to a satellite with an antenna only 1.5 m long pointed in just the right direction.
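
The Doppler arithmetic for a pass like that is simple. The orbital speed and band frequencies below are typical values I'm assuming, not details I remember from the demo:

```python
C = 3.0e8        # speed of light, m/s
V_ORBIT = 7.5e3  # typical LEO orbital speed, m/s

for freq_hz in (145.8e6, 435.0e6):  # 2 m and 70 cm amateur satellite bands
    shift_hz = (V_ORBIT / C) * freq_hz  # worst case: satellite coming straight at you
    print(f"{freq_hz / 1e6:.1f} MHz: up to +/- {shift_hz / 1e3:.1f} kHz")

# Roughly +/- 3.6 kHz at 2 m and +/- 10.9 kHz at 70 cm -- enough that you
# have to keep retuning a 10 kHz-wide FM channel as the satellite passes.
```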

Phased-array antennas now offer the amazing possibility of transmitting wideband signals (compared to the 10 kHz FM signal I saw in the demo) to a satellite with an antenna built into a mobile device's enclosure.

A Look Back at Wi-Fi and to the Future of MIMO in Wi-Fi

April 4th, 2015

The articles about the future of Wi-Fi in last month’s IEEE Communications Magazine have me thinking about my experience with the history of Wi-Fi.

802.11(b) - This was the standard when I started doing wireless projects in the early 00s. (b) uses the 2.4 GHz band. The maximum raw data rate is 11 Mbps.

802.11(g) came along in the mid 00s. It used OFDM and higher-order modulation to achieve a 54 Mbps data rate in the same bandwidth (20 MHz) as (b).

When my colleague and mentor Jim Weikert and I tested the first cards, we spent days trying to work out why so many packets were being re-sent because the receiver did not acknowledge them. We later realized that those higher-order modulations require a very good signal-to-noise ratio (SNR) to maintain a low bit error rate (BER). The fallback algorithm would settle on a rate that had a 10% packet error rate (PER); we were used to 0% PER with 802.11(b). The reason is that your overall throughput is greater at, say, a 36 Mbps data rate with a 10% PER than it would be if the card fell back to 24 Mbps or 18 Mbps to attain a lower PER. Complicating matters, OFDM requires excellent amplifier linearity, so most transmitters must reduce their output power to send the higher data rates.
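
Here is that tradeoff as arithmetic, in a quick Python sketch (it ignores retry timing and MAC overhead, so the numbers are only first-order):

```python
# Effective throughput is roughly raw_rate * (1 - PER).
options = [
    (36e6, 0.10),  # hold 36 Mbps and tolerate a 10% packet error rate
    (24e6, 0.00),  # or fall back to 24 Mbps for an error-free link
    (18e6, 0.00),
]

for rate, per in options:
    goodput = rate * (1 - per)
    print(f"{rate / 1e6:.0f} Mbps at {per:.0%} PER -> ~{goodput / 1e6:.1f} Mbps effective")

# 36 Mbps at 10% PER yields ~32.4 Mbps, which still beats an error-free
# 24 Mbps -- so the fallback algorithm is right to tolerate the retries.
```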

802.11(a) came along around the same time. It's just like (g) but uses the 5 GHz bands.

802.11(n) became the commonly-used standard around 2010. It supports 40 MHz bandwidths, slightly more than doubling the number of OFDM subcarriers, making it like (g) times two. It also supports multi-stream MIMO. MIMO is the most amazing technology I have worked with. It allows different streams of data to be sent from different antennas on the same frequency at the same time. The receiving side has the number-crunching power to disentangle the cacophony of interfering signals arriving at its antennas, using a model of the channel function between each transmitter and receiver.
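
Here's a toy numerical sketch of that disentangling, using the simplest possible detector (zero-forcing, i.e., inverting the channel matrix). Real receivers use more robust schemes like MMSE, and the channel values here are random illustrations:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 2x2 MIMO link: two symbols sent at once, on the same frequency,
# from two antennas.
H = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))  # channel model
x = np.array([1 + 1j, -1 - 1j])                             # one QPSK symbol per TX antenna
n = 0.01 * (rng.normal(size=2) + 1j * rng.normal(size=2))   # receiver noise

y = H @ x + n                  # the cacophony arriving at the two RX antennas
x_hat = np.linalg.solve(H, y)  # zero-forcing: undo the channel mixing

print(np.round(x_hat, 2))      # close to the transmitted [1+1j, -1-1j]

# If the two propagation paths are too similar, H is nearly singular and
# solving for x amplifies the noise -- which is exactly why antenna
# separation and polarization diversity matter in practice.
```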

In my experience the 40 MHz channels were an instant leap forward, doubling throughput at the price of only 3 dB of link margin. Multi-stream MIMO was a technical marvel but of less practical benefit. It works flawlessly in the ideal case of connecting each transmitter and receiver with separate cables and attenuators. In a practical scenario, it works best if the antennas are separated by at least a few percent of the distance between the transmitter and receiver and/or have different polarization angles. If the paths between the antennas in the array are too similar, multi-stream MIMO won't work at all, or it will work only at low data rates. Working out whether to do single-stream at a high data rate or multi-stream at a lower rate is not trivial.

802.11(ac) supports 80 MHz and 160 MHz channels and more MIMO options. One of the papers in IEEE Communications discussed MIMO techniques in (ac) and future Wi-Fi standards. I told the authors of my experience with MIMO being difficult to use in practical scenarios and asked whether it is practical for handheld devices, where it's hard to separate the antennas. Dr. Joonsuk Kim explained that we are beginning to see portable devices using MIMO; it has improved since the first (n) cards a few years ago. There is such demand for high throughput that it is hard to satisfy simply by increasing the number of subcarriers. I mentioned problems I had seen using MIMO in a multipath-rich environment with large delay spreads. Dr. Inkyu Lee said we could always decrease the symbol rate and increase the number of subcarriers, although that may never be part of the Wi-Fi standard. The guard interval between Wi-Fi symbols is 800 ns, with an optional 400 ns for slightly higher throughput.
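
The numbers Dr. Lee's suggestion plays against are easy to sketch; these are the standard 802.11a/g/n/ac OFDM values:

```python
# Subcarrier spacing fixes the useful symbol time, and the guard interval
# bounds the multipath delay spread the link can tolerate. Halving the
# subcarrier spacing (more, slower subcarriers) would double the symbol time.
spacing_khz = 312.5            # 802.11a/g/n/ac subcarrier spacing
symbol_us = 1e3 / spacing_khz  # useful symbol time: 3.2 us

for gi_ns in (800, 400):       # standard and short guard intervals
    total_us = symbol_us + gi_ns / 1e3
    overhead = (gi_ns / 1e3) / total_us
    print(f"GI {gi_ns} ns: {total_us:.1f} us per symbol, {overhead:.0%} overhead")

# 800 ns GI -> 4.0 us symbols (20% overhead); 400 ns -> 3.6 us (11%).
# Delay spreads longer than the guard interval cause inter-symbol
# interference no matter which MCS you pick.
```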

Below is a graph showing how much throughput is possible with 802.11(ac) for various numbers of streams. It all depends, of course, on the antennas being well separated and the channel functions between each TX and RX antenna being different enough for the receiver to recover the individual streams.
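
For reference, here is the arithmetic behind a graph like that, sketched for the 80 MHz, 256-QAM, rate-5/6, short-guard-interval case:

```python
# 802.11ac PHY rate scales linearly with the number of spatial streams.
DATA_SUBCARRIERS = 234       # data tones in an 80 MHz 802.11ac channel
BITS_PER_TONE = 8 * (5 / 6)  # 256-QAM (8 bits) times rate-5/6 coding
SYMBOL_S = 3.6e-6            # 3.2 us symbol plus 0.4 us short guard interval

for streams in (1, 2, 3, 4):
    rate = streams * DATA_SUBCARRIERS * BITS_PER_TONE / SYMBOL_S
    print(f"{streams} stream(s): {rate / 1e6:.1f} Mbps")

# 433.3 / 866.7 / 1300.0 / 1733.3 Mbps -- assuming the channel matrix
# actually supports that many independent streams.
```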

Conclusion
I now accept the 10% retry rate that Jim Weikert and I first observed 10 years ago with 802.11(g) as a normal fact of Wi-Fi. The challenge for Wi-Fi products now is choosing when to use more carriers (wider channels) and when to use MIMO. Picking the right modulation and coding scheme (MCS) and switching dynamically as channel conditions change will become a fact of life, just as the 10%-15% retry rate has been for the past decade.

Serial Decoding Comes in Handy

April 2nd, 2015

Last week we were testing a board that has a CPU configuring a video chip over I2C. We saw data and clock, but the video chip was not responding. I suggested we see if something was wrong with the video chip. My colleague, Erik Belanger, reminded me we had the DPO4EMBD serial data module installed in the Tektronix scope we were using, so we could decode the data by pressing a button. Once we saw the data decoded, we realized the problem was the address of the video chip.

We changed a strapping resistor, and the video chip started responding.
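
For anyone who hasn't chased one of these: here's a sketch of why a wrong strap looks like a dead chip. The addresses are hypothetical, not those of our actual video chip:

```python
# Many I2C devices sample a strap pin at reset to select their address.
BASE_ADDR = 0x48  # hypothetical 7-bit base address

def device_addr(strap_pin_high):
    return BASE_ADDR | (1 if strap_pin_high else 0)

chip_addr = device_addr(strap_pin_high=True)  # chip listens at 0x49
host_byte = (BASE_ADDR << 1) | 0              # host sends 0x48 plus the write bit

print(f"chip expects 0x{chip_addr << 1:02X} on the wire, host sends 0x{host_byte:02X}")

# The chip never sees its own address, so it never ACKs anything -- a bus
# that looks alive on the scope but with a dead-silent peripheral, which is
# exactly what the decoded trace made obvious.
```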

We would have realized this eventually, but it's nice to have decoding capability built into the scope. Having it on a Tektronix scope gives me the same functionality I got from my cheap MSO-19 years ago, but it's much more reliable. There's nothing amazing anymore about having serial decoding built into your scope. It's very handy, though, to have the capability without getting out another piece of equipment.

Disrupting Survey Markers

March 24th, 2015

Today at a Wisconsin Innovation Network meeting, Mike Klonsinski of Bernsten spoke to the group about how SOLOMO, a Madison-based company focused on location analytics and mobile engagement, is helping them manage survey marker data from the RFID tags they are adding to their survey markers.

Liz Eversoll and Mike Klonsinski at WIN

An interesting fact I learned is that the ground shifts, so you cannot rely on GPS coordinates to set boundaries. Bernsten's metal markers buried in the ground are the final arbiters of boundaries, including the boundaries between states and between the US and Canada. If you somehow moved them, he joked, it would be a federal crime, but it would also create a legal issue because those markers are the official boundary.

At one point Klonsinski mentioned that not everyone at his company immediately embraced adding technology to their markers. Klonsinski said they didn't want to wait until they started losing customers. This made me think of the response to disruptive technologies described in The Innovator's Dilemma. Along the path the book describes, RFID-based survey markers would at first have taken only the most price-sensitive customers, and by the time more customers began moving it would have been too late to adopt the new technology. Bernsten is avoiding that trap by embracing the technology before it starts costing them customers.

The talk was a nice reminder that even in this day and age we still need to physically make things, and that technology can make them better.

Anthropomorphic Appearance May Be a Trend in Robotics

March 23rd, 2015

I always thought of anthropomorphic robots as being for toys or science fiction. The Sawyer robot from Rethink Robotics (see the recent article in IEEE Spectrum) makes me wonder if the trend in robotics will be toward anthropomorphism, similar to the trend of increasing user-friendliness in computers.

Computer applications in the 80s came with a book. You had to read most of it to learn the keystroke sequences to use the program effectively. When graphical user interfaces became common, applications acquired clusters of buttons so a user with no experience could begin working without reading the manual. Eventually the menus became organized in a fashion that didn't take up too much screen space yet still allowed users to run the software with no training.

We may see this same pattern in robotics. One of the improvements of Sawyer over its predecessor, Baxter, is that it can fit easily into an area normally occupied by a human being. They are marketing it to low-cost regions where human-labor-intensive tasks are often done, apparently with the notion that the robot can take over some difficult-to-automate tasks.

If the trend toward anthropomorphic robots continues, first we'll see robots that are somewhat anthropomorphic and can be trained simply by moving their hands to teach them the task. At first this may be inferior to robots that require programming to learn new tasks. Eventually, human-like learning will get good enough that it becomes the primary way people interact with robots.

Testing Nordic nRF51822 BLE SoC

October 12th, 2013

I had a chance to test Nordic Semiconductor's nRF51822 Bluetooth Low Energy system on a chip last week. It reminds me of a TI CC2540 but with an ARM. Thoughts and test results are in my nRF51822 post over on e14.

A $1126 Mistake

October 6th, 2012

For the past few months I rented a VNA from TRS Rentelco.  It was a nice $14,000 VNA, and they only charged $600/mo plus tax.  At the same time, I rented an HP 85033D cal kit.  I could have made my own, but I decided to get a nice one for $100/mo.

Eventually I made my own cal kit and tested it against the HP kit. It worked, so I stopped renting the HP kit and returned it. A month later TRS Rentelco told me I had done $1126 in damage to the open portion of the kit. They sent me a picture showing the damage.

It sounded bogus. It could have been that way when I got it. I only mated it a few times with a factory-made cable, and it worked fine before I sent it back. Still, I called my trusted friend and RF expert, Robb Peebles of LSR, to ask if this was a bunch of baloney.

Before I was done explaining what they were charging me for, he said, “Was it on the open? The opens are fragile. Did you rotate the 3.5mm socket with respect to the center pin on the mating connector?” He went on to explain what TRS Rentelco had explained: 3.5mm connectors are similar to, but not exactly the same as, SMAs. They are higher-end and very fragile, especially the open on a cal kit.

So I guess TRS Rentelco was right. I must have damaged the open when I connected it to my cable, turning it with respect to the mating connector instead of pushing the two together and tightening the screw on the outside of the SMA.

So by turning something gently but in the wrong way, I did $1126 worth of damage to a part whose function is to do nothing. (Well, nothing at low frequencies, anyway.) At the frequency I was working at (1 GHz), I could probably have done as well by leaving the SMA unterminated instead of using the open, and saved $1126.

Be careful with 3.5mm connectors.

CC430's Core Is Powered By Built-In Switcher

January 27th, 2012

I have been working on some boards with TI CC430 processor + radio chips. It amazes me how much is crammed into that part. It includes a complete multiband transceiver with 10 mW output.

The radio portion runs on 2 V, and a built-in linear regulator allows you to power it from 2.1 to 3.6 V. Since it's a linear regulator, the current draw is the same regardless of the input voltage, so it pays to use the lowest input voltage possible.

Earlier this week I was working out a scheme to do this for both the radio portion and the processor portion. I thought I could save power by reducing the voltage to the processor, but it turns out the CC430 has a built-in inductorless switching regulator to power the processor core. So if you lower the input voltage, the current increases, leaving the power roughly constant. This means an engineer using this part can choose any supply within the allowed range without affecting power consumption.
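
A quick illustration with made-up numbers (the currents are assumptions for the sake of arithmetic, not CC430 datasheet values):

```python
# A linear regulator passes the load current straight through, so input power
# scales with input voltage. An ideal switcher draws constant power, so its
# input current falls as input voltage rises.
LOAD_V, LOAD_I = 2.0, 0.015  # assumed regulated rail: 2.0 V at 15 mA
LOAD_P = LOAD_V * LOAD_I

for vin in (2.1, 3.0, 3.6):
    linear_p = vin * LOAD_I    # linear: same current at any input voltage
    switcher_i = LOAD_P / vin  # ideal switcher: same power at any input
    print(f"Vin {vin} V: linear {linear_p * 1e3:.1f} mW in, "
          f"switcher {switcher_i * 1e3:.2f} mA ({LOAD_P * 1e3:.1f} mW) in")

# The linear input power grows from 31.5 mW to 54 mW as Vin rises, while the
# switcher stays at 30 mW -- which is why the supply choice barely matters
# for the CC430 core.
```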

What more could they do to the CC430?  Could the next step be a switching power supply for the radio portion too?

Politically Motivated Attacks on STEM Workers

January 15th, 2012

Iranian scientist Mostafa Ahmadi Roshan was murdered, apparently for his work on the Iranian nuclear program. It is unclear whether it was a foreign covert operation. It is possible the Iranian government carried out the attack, intending to put the blame on Israel. It was very unfortunate that an Israeli military spokesperson said he is “definitely not shedding a tear” over the murdered scientist.

This attitude of tolerance toward violence against scientists and engineers working on controversial projects is completely unacceptable. The US should reevaluate its support for the Israeli government if there is evidence it condones this type of attack.

Iran claims its nuclear program is for peaceful purposes, but it is likely that its goal is the development of weapons. It is understandable that governments that have nuclear weapons want to keep their own while keeping others from getting them. This creates a perverse incentive for governments to want to join the nuclear weapons club.

Regardless of the politics, scientists and engineers should never be targets of violence for the programs they work on. Eight years ago I casually talked to someone from the US nuclear weapons program about working for them. They assured me the job was to control access to existing weapons, not to develop new ones. I suspect that was correct, although I found a different job and never learned any more about the nuclear position. If I had worked on the project, foreign powers might have seen me as a military target, but they would have been wrong. Engineers and scientists have a responsibility to limit their work to projects they believe are moral and ethical. They should never be made part of geopolitical power games.

Photo from Fars News Agency/European Pressphoto Agency

Skeptical Toward “Neurological Conditions”

December 29th, 2011

The New York Times had a good article this week about two teenagers who are identified as being on the “autism spectrum”: Navigating Love and Autism.

The article was well written and held my interest, but it is really surprising that a case of geek love merits so much discussion. It tells us how the couple “have trouble reading emotions and gauging social cues that others take for granted.” It makes me think of people who have trouble understanding the mathematical models and control loops in nature that others take for granted.

Consider some basic everyday mathematical abilities that some people lack:

  • Estimating distances, weights, and volumes within an order of magnitude
  • Distinguishing open-loop systems, like a typical stove, from closed-loop systems, like a typical oven
  • Taking into account that the coefficient of skidding friction on a slick road is significantly less than the coefficient of static friction
  • Evaluating financial products such as mortgages or insurance agreements
  • Knowing that the levels of most medicines in the body decay roughly exponentially
  • Distinguishing correlation from causation

It's easy to dismiss the lack of these abilities as idiocy. It would be easy, although I've never seen it done in a mainstream article, to dismiss them as a neurological condition and write a cute article about the relationship troubles these foibles cause.

If we're going to label all idiosyncratic personality traits as conditions, hard-core salespeople and politicians need a label just as much as hard-core engineers do. Probably everyone who's hard-core about anything and goes out and changes the world could get a neurological label. We're supposed to have moved beyond this sort of thing. We need to be skeptical of “neurological conditions” whose only symptom is mildly odd behavior.