As always with radio, it's mostly about unobstructed line of sight and the gain in your antenna system.
We're still in communication with Voyager 1, which is operating on a grand total of about 20 W of RF power, and is currently about 14.5 billion miles away.
At the receiver, you have "minimum detectable signal", MDS, measured in dBm.
At the transmitter, you have power out, measured in dBm. Add transmitting antenna gain, in dB, subtract propagation loss through medium(s), add receiving antenna gain, and if that number is greater than MDS, you win! The Really Great Science in Voyager is the added factor of "coding gain" -- sophisticated error correction codes can give you a many dB adder, at the expense of data rate (nobody cheats Claude Shannon).
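That dBm bookkeeping can be sketched in a few lines. All figures below are illustrative placeholders, not actual Voyager numbers:

```python
def link_margin_db(tx_power_dbm, tx_gain_db, path_loss_db,
                   rx_gain_db, coding_gain_db, mds_dbm):
    """Received level (dBm) minus MDS; a positive margin means the link closes."""
    rx_level_dbm = (tx_power_dbm + tx_gain_db - path_loss_db
                    + rx_gain_db + coding_gain_db)
    return rx_level_dbm - mds_dbm

# Illustrative only: 20 W is ~43 dBm; the gains, path loss, and MDS are made up.
print(link_margin_db(tx_power_dbm=43, tx_gain_db=48, path_loss_db=310,
                     rx_gain_db=74, coding_gain_db=8, mds_dbm=-150))
```

If the result is greater than zero, you win; the coding-gain term is the error-correction adder the comment describes.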
It's also amazing considering it's using a computer built over 45 years ago from discrete components with only about 70 KB of memory, while operating on a gradually decaying radioisotope power source. Voyager 1 also holds the record for the longest continuous operation of a computer.
Pretty much, although I think the RTG is mounted on a boom as far away from the other equipment as possible; still, the computer needs to be able to tolerate a decent amount of radiation. It also has an interesting redundant design: it has double the component count and can either exploit that for extra compute or keep it as redundancy in case of component failure.
Solar particles become less of a threat, but the sun's atmosphere actually shields the solar system from some types of cosmic rays. Don't know what proportion of electronics faults stem from each, though.
> ...the last of the project's original programmers, retired, and it was difficult to find a replacement with such in-depth knowledge of what now seem like ancient hardware and design principles
That makes me realize, it would actually be an incredibly fun job to be in the team that manages the simulation/emulation stack that people tinker with to learn about this thing.
Going on a lot of presumption here (and a bit of optimism...), I would presume "don't brick it" is backed up by an appropriate level of funding to cover the training people receive before they get to touch production. In an optimal scenario (don't be wrong... don't be wrong...) this would cover component-level electrically-accurate (because radiation) logic simulation that everything gets tested on first.
NASA doesn't mess around with simulators. I've read accounts from astronauts and ground controllers where they say their actual missions feel anticlimactic after all the insane things they had to do in the simulators. Gene Kranz managed to crash a simulated Apollo mission into a mountain.
The code on the computers has been upgraded, and even the language used to program them has changed over the years [1], which leads me to believe that the computer has been restarted/rebooted many times in all these years.
I think there may be other computers, right here on Earth, that may be able to at least compete given those criteria.
[1]: From the linked page:
> The CCS originally ran software written in the Fortran programming language, but this has been continually upgraded since its launch (software updates can be transmitted and installed remotely). The current software is written in a mixture of C and Fortran.
The following is fascinating.
> The age of the computer and its codebase has caused problems for NASA in recent years. In 2015 Larry Zottarelli, the last of the project's original programmers, retired, and it was difficult to find a replacement with such in-depth knowledge of what now seem like ancient hardware and design principles.
Hello. You've reached the Voyager I spacecraft. We can't answer your ping right now but if you'd like to leave a message, please, do so after the beep. We look forward to servicing your request as soon as possible.
I think an older probe also took the first digital pictures before bitmap images were a thing, they could digitise it at the probe end but had to colour in the pixels on a piece of paper at the other end if I recall correctly. Point is, NASA had to solve a lot of digital communication problems long before they had established terrestrial solutions or before computers were even capable of solving the whole problem on their own.
Then again, even if the timing were reversed, a lot of terrestrial communication protocols have no chance of working properly at the ranges required for space exploration. Something I've been meaning to look into is the redundancy requirements over those long distances: re-transmission is so costly in terms of latency that it probably makes sense to pack as much redundancy into the signal as possible. We already do this with terrestrial communication in the form of various error-correcting codes, but there the balance between increased bandwidth and the relatively low latency cost of re-transmitting an unrecoverable sequence is finer... I'm guessing you want a far lower probability of loss for a 43-hour round trip.
> As always with radio, it's mostly about unobstructed line of sight and the gain in your antenna system.
Indeed. If you grow up with your most common radio interactions being an FM car radio and a dumbphone, you get the impression it's entirely about range. Then you buy a drone and find out one pine needle shaves 50% off of your signal strength.
The Mavic 2 series of drones by DJI don't have this issue, but the tradeoff is the controller weighs like 40 pounds, because it's quite a powerful wireless access point. Every other drone I've owned that supported video used an Android or iOS device to connect to the drone via wifi (drone presents as a WAP).
IIRC the claimed range is around 8 km on the one I have, about 5 miles. I have assuredly gone well over 2 km with no issues with control or video feed - this was over a straight highway. I routinely fly around a kilometer away, and the only issue I have is if I launch from an extremely dense patch of pine trees: at about 800-900 meters I will lose video (artifacting for a second), but not control. It's never had to RTH.
In case you're curious about city usage, I have a friend who launches one from a cul-de-sac in Orange County and can fly in nearly any direction for about 8 minutes before he hits a geofence; the drone still functions normally. If there are any issues, he can just fly higher.
The newest DJI stuff claims even more ridiculous range: 15 km+ over open water, for instance. If I hadn't used it myself, I wouldn't have believed it; it sounds like BS.
> but the tradeoff is the controller weighs like 40 pounds, because it's a quite powerful wireless access point. Every other drone i've owned that supported video used an android or ios device to connect to the drone via wifi (drone presents as a WAP).
That is not accurate. The modified wi-fi some DJI drones use is not reliant on the smartphone/tablet attached to the controller. It's strictly between the drone and the controller, which passes data on to the phone via USB. The drones can be switched into AP mode for faster media downloads, but at that point they lose connection with the controller. Ocusync controllers weigh 390 grams, and that isn't that much considering their build quality and the fact they have two 18650 cells inside.
Thanks for sharing those figures. I've always known "it's amazing" and "it's far away", but those numbers really put things into perspective.
Is there anything particularly special about the antennae on the spaceship? They must be rigorously aligned to point at Earth, and even a slight knock would spoil everything? Or is it more resilient than that?
There is a beamwidth associated with a parabolic antenna's size and operating frequency. The attitude of the spacecraft is periodically corrected to keep earth within the beam's central lobe. It's approximate, so corrections aren't required too frequently, and it becomes less of an issue as Voyager gets further away from earth's orbit around the sun. Once it's out of propellant it won't be able to correct for torque from the solar wind etc., and the high-gain antenna's central lobe will drift until it's no longer pointing close enough to maintain an acceptable signal-to-noise ratio. Unless they can maintain contact through the omnidirectional antenna, Voyager will be effectively lost.
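The beamwidth/size/frequency relationship can be approximated with the common ~70·λ/D rule of thumb. The 3.7 m dish diameter and 8.4 GHz X-band downlink used below are my assumptions about Voyager's high-gain antenna, not figures from the comment:

```python
def beamwidth_deg(freq_hz, dish_diameter_m, k=70.0):
    """Approximate half-power beamwidth of a parabolic dish, in degrees."""
    c = 299_792_458.0          # speed of light, m/s
    wavelength_m = c / freq_hz
    return k * wavelength_m / dish_diameter_m

# A 3.7 m dish at 8.4 GHz has a central lobe well under a degree wide:
print(round(beamwidth_deg(8.4e9, 3.7), 2))
```

A lobe that narrow is why the attitude has to be actively corrected to keep Earth inside it.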
I would think the biggest part of being able to transmit that far is the near-perfect vacuum of space: there is almost nothing, not even air particles, between us and the probe. If you threw a beach ball from the distance of Voyager straight at Earth, it would eventually make it here.
For some two-way wireless protocols (like wifi) you have to take into account the guard interval, slot times and interframe spacing, which are all values set in time (~1-50 µs). For long-distance transmissions your speed-of-light-limited signal propagation time can exceed these values.
In terms of size, usually guard interval < slot time < inter-frame space. If propagation exceeds the guard interval AND you have a channel with lots of echo, any communication will be difficult. If propagation exceeds the slot timing, coordination between more than 2 devices will be difficult (high retries/low throughput). If propagation exceeds the interframe spacing, a two-way wifi connection will not be possible, as both stations will think every frame timed out waiting for an ACK.
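To see where one-way propagation delay overtakes those timers, compare distance against the speed of light. The microsecond values below are ballpark 802.11 figures for illustration, not from any one amendment:

```python
C_M_PER_S = 299_792_458.0  # speed of light

def one_way_delay_us(distance_m):
    """One-way speed-of-light propagation delay in microseconds."""
    return distance_m / C_M_PER_S * 1e6

# Ballpark 802.11 timings and the distance at which propagation matches them:
for name, limit_us in [("guard interval", 0.8), ("slot time", 9.0), ("SIFS", 16.0)]:
    d_km = limit_us * 1e-6 * C_M_PER_S / 1000
    print(f"{name} ({limit_us} us) exceeded beyond ~{d_km:.1f} km")
```

At the article's 43.33 km, the one-way delay is already ~145 µs, so the ACK timeout has to be stretched well past stock values for the link to work at all.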
It's also handy when you only need a low data rate and can make your channel bandwidth as wide as you like without worrying about licensing restrictions.
Yes, radio waves are attenuated in the atmosphere. This is highly frequency dependent - for practical applications the lower the frequency, the less radio waves are attenuated. In comparison with attenuation from obstacles in non-line-of-sight situations, the atmospheric component is not significant.
For really long range propagation on earth, reflections on atmospheric layers are the dominant factor (as there is no line of sights due to the curvature of the planetary surface).
Does that mean that there is an electromagnetic frequency that, no matter how much power we can feasibly put into it, simply will not transmit through Earth's atmosphere due to its instantaneous attenuation outside of a vacuum?
Yes - in the terahertz band, for example, you have very high attenuation of more than 1000 dB/km. It's a spectrum (no pun intended), so depending on where you draw the line on "through earth's atmosphere" in terms of how far you want to transmit, you'll have some frequency where this becomes infeasible, or you'll need more directional antennas and/or more transmit power.
Atmosphere does play a part, but the free-space losses alone are massive - probably at least 250 dB just for the distance Voyager is at; atmospheric attenuation adds comparatively little at these frequencies. Voyager's antenna has about 40 dB of gain, but the DSN can array multiple antennas to make up for the losses. They have up to two 70 m dishes and several 34 m dishes that can point at Voyager - quite a massive antenna gain.
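The free-space figure is easy to sanity-check with the standard path-loss formula. The 8.4 GHz X-band downlink frequency below is my assumption:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss: 20*log10(4*pi*d*f / c), in dB."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

distance_m = 14.5e9 * 1609.344   # ~14.5 billion miles, in meters
print(round(fspl_db(distance_m, 8.4e9)))  # roughly 318 dB at X-band
```

So "at least 250 dB" is, if anything, an understatement at the downlink frequency; that's the hole the dish gains, arraying, and coding gain have to dig out of.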
It does, but it is not as significant in normal weather conditions in the frequency ranges we're dealing with here (likely sub-2.4 GHz) [1]. (General rule of thumb: the atmosphere absorbs everything except the visible spectrum and 10 cm - 10 m wavelengths.) Compare this to the effect of the inverse square law. :)
Now this might be significant enough for directional beams with a huge constant multiplier (like an 'ideal' laser with no divergence). Someone can probably give insight on it here.
I'm curious why you ordered the wavelengths the way you did - you totally potholed my entire comprehension. 10 meters (28 MHz) through 10 cm (2800 MHz) to me reads "left to right".
Oh I just wrote it like that cause that's what I remembered off the top of my head. For some reason my brain prefers using wavelengths for Radio waves (short wave radio, long wave radio) while frequencies when looking at the whole spectrum.
I seem to have a recollection that a cheap DIY system is to get two RCA 18" dishes (or larger), remove the LNB (or whatever the horn part is called) from each dish, and affix a "cantenna" or other waveguide to the arm. Then, using the shortest, thickest coax you can manage, connect the two cantennas/waveguides together and aim them 180 degrees apart. I also seem to remember that putting this "halfway", even if halfway happened to be the highest point, wasn't ideal; it was something like 60/40 or 70/30.
Nowadays, MikroTik or Ubiquiti make doodads you can replace the LNB with that instead provide some wifi band, with PoE, so two of those up on a tower means you have active repeating, and you can probably push that solution out a lot further than 43 km.
For the record, I put a USB wifi stick inside of a Pirouette cookie can and it was a phenomenal "war driving" antenna. I use folded-dipole-fed Yagi antennas now for doing wifi surveys, but in a pinch, 802.11b 1000 mW+ wifi cards are stacked up in a filing cabinet...
I want this to be part of a Hollywood movie: the protagonist is using his makeshift wifi setup to hack into a bank's computer systems 43 km away from his location, and the bank people have no idea what is going on.
Why? I'm not interested in answering why our router is not capable of that, and there is no such router on the market.
But on a serious note, some enterprise networks are unsecured enough, that you could probably login to root of their server from anywhere in the world.
Uhh, maybe a believable plotline like "Because the protagonist lives on a crumbling bridge with a military-trained dolphin encircled by cyborg psychopaths and Japanese megacorp military types while a highly infectious disease ravages remnant humanity and it's the end of the world?" https://www.imdb.com/title/tt0113481/
Hah! I only just watched that again a couple of weeks ago. With every re-watch I'm reminded of how terrible that movie is, and with all the passing of time between re-watches my memory of it continues to improve until it seems worthy of another re-watch - and then I'm disappointed again.
He got the Wifi network to show up, but did he get a connection?
A bit higher quality point-to-point can be obtained with a bit of specialized equipment, Mikrotik has a bunch: https://mikrotik.com/product/MTAD-5G-30D3-PA for example, can go 40+ km.
RH 660s. I was skeptical at first, but it has really held up. I get really good signal reports and people always say they're surprised when I'm using a UV5R. I did drill the microphone port open a little wider by the way and I think that helped get some more gain/clarity as well.
Note that the RH 660s is 27 centimeters collapsed (11 inches), and 1 meter fully extended. It is rigid so not one of the floppy antennas. Collapsed it can take abuse, but when extended, bending is a risk.
Yes, though the issue with many Baofeng radios is that they get banned in countries due to bleed into other frequencies from harmonics that break the rules. Covered better here: https://www.youtube.com/watch?v=V0EdkdNqczk
Baofeng caught heat because their radios are "unlocked", and in the US you cannot sell a radio that can set arbitrary frequencies in the FRS or GMRS bands (I forget which and it's irrelevant) - that is, they can't have keypads, for one thing. Another issue is the power: if your radio can transmit on those two bands it must use <= 1000 mW, and all of the Baofengs claim at least 5 W transmit. Yet another reason is that FRS/GMRS radios (whichever) must have *fixed* antennas.
The hash is just a reason for people to complain about them. As shipped from the factory, they're below the minimum hash levels required. Now, if you connect them to a cheap RF amplifier, you might run afoul of your license's rules about hashing up the harmonics.
Personally I prefer Quansheng, as their speakers are louder, but I've never had an issue with a Baofeng, especially with an external speaker/mic/antenna.
For posterity: I am extremely tired, and I welcome all elucidations and corrections, because I'm probably misremembering something here.
I'm interested in this too— ten years or so ago I briefly got into long range wifi and bought a bunch of Ubiquiti gear (eg https://dl.ubnt.com/sr71a_datasheet.pdf). I was never successful in getting much more than 1km, even with their recommended antennas, but I was also working with watercraft, and I know that's hard mode for 2.4GHz, particularly when your antenna isn't able to be very far off the surface and only one end can be directional.
My impression though is that recent advancements in wifi have all been focused on getting high bandwidth at very short range, like same room line-of-sight, so I wouldn't assume there'd be much to be gained over b/g/n range performance in trying an ac- or 6-based system.
I made a comment above about the Mavic drones having 8-15 km+ "line of sight" range with their handheld WAP controllers, and my having flown 2 km without any issues. I doubt I'll ever go to the max range, but the power is there to get in and around trees, towers, whatever.
But with 440 MHz I've transmitted data over 100 miles with full decode, from my driveway in a forest with a six-element Yagi. And a couple of years ago the first trans-Atlantic 440 MHz (UHF) transmission ever was accomplished.
The main issue with consumer wifi is that it's attenuated by water, and by nature, nature is full of water. That's why UBNT switched to 12 GHz or 24 GHz for their long-range "airFiber" stuff, and hacker hams try to find 902 MHz band capable equipment, as 900 MHz can "punch through" more vegetation than 2.4 or 5.8 GHz.
I looked into long-range 700 and 900 MHz systems a little bit, but it seemed like they were much more geared toward control-plane and telemetry-type data than being saturated by a real-time video feed.
Sorry I didn't reply sooner; yes, it's compressed video, but no frame drops on a clear path - I was able to get some I-frame weirdness a few times flying low over about 800 meters of trees, but no control issues.
> Certainly the author picked a beautiful region to do that kind of outdoor experiment in.
I don't mean to be argumentative, but I wonder what the ugliest 43.33km line of sight environment one could find would be? I imagine it's quite a good proxy for beauty, the maximum distance you can see.
There's an old dish antenna on the roof of my office that's pointed at a building we used to have in the 90s. I've wondered if I could get it to do anything cool, since my house is near the other building, but I don't think I could get an antenna high enough to get line of sight.
I would not be surprised at all if half or more of the dishes you see on commercial buildings are just vestigial, waiting for a signal that will never come again.
I have seen so much obsolete equipment that was left in place after replacement, some of it even still plugged in and running but not connected to anything. My favorite was an SCO box that beeped every day when it attempted to dial out to another server, long since decommissioned, over a phone line that was disconnected years ago.
In Los Angeles, California, there is a street called Sepulveda Blvd[1]. With extensions outside the city proper that have the same name, it once went 68.9 km end to end. Unfortunately for my hypothetical, it is not a straight line. But to me, it increases the odds that there is a street somewhere in the world's large metropolises that extends that far, and that it goes in a straight line, and you could see one end from the other, or at least the tops of buildings at either end could see each other. And then run-down is in the eye of the beholder.
Sepulveda is a good one, but I remember for a small section it's called "Imperial Highway"[1], and that is 169 km, with the longest "straight" section being 50 km[2]. I have driven that stretch many, many times, and it takes hours, even at midnight.
The flatness makes this not a workable line of sight environment without towers due to the Fresnel zone though, the topography is the only reason this 43km link worked. Need to find an ugly hilly region, a much harder task than an ugly flat region :)
Overall I've had very good luck with the RF performance of external WiFi sticks. All too often the antennas inside laptops and phones are really an afterthought so the WiFi and (particularly) Bluetooth performance is awful.
I wonder if optical communications (with lasers or similar technology) would be a better choice in this situation, given that there is line of sight. With WiFi you are mostly limited by the legal requirements regarding transmit power.
Huh, did not know that Adam 9A4QV had also that site in addition to the Microwave Croatia blog and the LNA4ALL product blog.
Dude has done all kinds of cool stuff.
Very simplified, the signal is carried within a "Fresnel zone", which is basically a 3D ellipsoid that is relatively wide in the middle, and you'd have to cover a lot of that area to block the signal... definitely more than a bird can do. https://en.wikipedia.org/wiki/Fresnel_zone#Clearance_calcula...
...unless the bird is standing in front of the transmitter/receiver... then yes.
Not unless the bird is very close to the antennas at either end.
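The clearance being discussed can be roughly quantified. A sketch using the article's 43.33 km / 2.4 GHz link:

```python
import math

def first_fresnel_radius_m(d1_m, d2_m, freq_hz):
    """Radius of the first Fresnel zone at a point d1 from one antenna, d2 from the other."""
    c = 299_792_458.0
    wavelength_m = c / freq_hz
    return math.sqrt(wavelength_m * d1_m * d2_m / (d1_m + d2_m))

# Widest point (midpoint) of a 43.33 km, 2.4 GHz link:
half_m = 43_330 / 2
print(round(first_fresnel_radius_m(half_m, half_m, 2.4e9), 1))  # ~36.8 m
```

A radius of roughly 37 m at the midpoint is why a bird in the middle barely matters, while the same bird perched right at an antenna sits where the ellipsoid pinches down to nearly nothing.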
This is a neat exercise in antenna design. They've built high gain directional antennas and minimized transmission line losses. The 15 dBi antennas aren't even that remarkable; you can buy 30+ dBi wifi antennas.
My first 'better than dial-up' internet connection was a 2.4 GHz wifi service across 7 miles. On my end was a roof mounted aluminum parabolic grid antenna. It worked rather well and sometimes I wish I still had it.
Mine too! Though mine was the other way 'round: I had a frac T1 line (256 kbps IIRC) and shared it with two other locations via a central omni antenna. BreezeCOM radios that I think were frequency-hopping (an early spread-spectrum scheme).
By line of sight, I think it means that it must be held on a tall enough mast not to be impacted by the curvature of the earth. IIRC the curvature of the earth limits line of sight to around 5 km at 1.7 m of height (roughly double that between two observers at that height).
As a reference, folks that fly quadcopters, drones and flying wings often transmit at 1 to 2 watts of power using line of sight (yes, at the edge of legal). 0.02 watts is an insignificant amount of power for radio transmission.
I think it's much more fair to compare EIRP to EIRP, I don't know the gain of this crazy foil dish but it's probably pretty high. FPV is way harder since you can't use a highly directional transmit antenna and you're weight and size constrained, so you usually have dinky <3dB transmit gain and have to make up for it with big radio amplifiers.
Given a good control link such as ExpressLRS [0], people can also fly quadcopters 10+ kilometers on as little as 10 mW. See for example the range competition at [1].
Of course, the VTX is usually a different matter..
And why not? When I lived in Utah, I could easily hit a mountain-top ham radio repeater 20+ miles away with my little 5 W handheld radio. A small weather balloon with a sub-1 W transmitter can easily be heard at 100K feet altitude by people hundreds of miles away. So long as you have LoS, it's no problem.
I'm from Northern Utah and used to try to get a response from repeaters on my handheld radio. Being an introvert who didn't really want to talk to people, it was one of the few things I really found interesting in ham radio. I got pretty far, but I was unable to hit the repeater in Wendover, NV (it was my next goal before I gave up). I used the local mountains to get myself up to an elevation where the curvature of the earth still allowed line of sight. I think Wendover is about 120 miles, line of sight, from the drivable mountain top I selected. The expanse has two mountain ranges, but there are some low spots that I thought I might be able to get through. The rest of it runs pretty flat over the salt flats.
Water molecules block 2.4Ghz spectrum that WiFi uses.
This is on purpose: the idea is to make the common WiFi (and Bluetooth) bands short range on purpose, so that many people within a city block can have local WiFi or local Bluetooth without interfering with each other.
So 2.4GHz over a long distance kinda goes against the design of WiFi / Bluetooth.
> Water molecules block 2.4Ghz spectrum that WiFi uses.
This isn't really true to any significant degree that matters, unless you are literally under water.
Rain fade is a thing, but is really only meaningful above 10GHz.
Edit: I should note, it's not that water droplets don't attenuate radio signals, it's just that losses on a typical radio path are already huge in perfectly clear weather - you might lose 99.99999999% (100 dB) or more of your signal strength between transmitter and receiver anyway.
The idea really was that 2.4GHz spectrum was already polluted by microwave ovens (because microwave ovens operate at the resonance frequency of water molecules), so it was left for public use.
> The microwaves in a microwave oven are not tuned to a resonant frequency of water. […]
> They heat the food through simple dielectric heating. […] Many types of molecules in the food absorb energy from the microwaves in this way, and not just water molecules.
2.4 GHz was an ISM junk band long before domestic microwave ovens arrived.
In fact, 2.4 GHz was used for microwave ovens specifically because it was an ISM band, and therefore was available.
Almost. 2.4GHz had been set aside for consumer devices long before wifi. Microwaves are essentially very powerful unlicensed transmitters. Wifi devices are also all unlicensed transmitters. So 2.4GHz devices don't use the same space despite microwaves, they use it because that space had already been given over to unlicensed devices. If not for microwave ovens we might not have wifi as it is today.
Back when people were gobbling up the spectrum, the military didn't care about 2.4GHz because of the water absorption issues. It wasn't good for communication at long distances and so it was allowed for consumer devices.
1 W at a very narrow bandwidth is a very different thing from a 20 MHz-wide signal at 2.4 GHz with a ~71 mW transmitter (100 mW EIRP with a dipole). Just calculate the power per hertz at those bandwidths.
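That comparison is easiest to see as power spectral density. A sketch, where 500 Hz stands in for a narrowband mode (my assumption) and 71 mW over 20 MHz is the Wi-Fi case from the comment:

```python
import math

def psd_dbm_per_hz(power_w, bandwidth_hz):
    """Average power spectral density in dBm/Hz."""
    return 10 * math.log10(power_w * 1000.0 / bandwidth_hz)

print(round(psd_dbm_per_hz(1.0, 500), 1))     # 1 W in a 500 Hz channel
print(round(psd_dbm_per_hz(0.071, 20e6), 1))  # ~71 mW across 20 MHz
```

The narrowband signal comes out roughly 57 dB denser per hertz, which is the commenter's point: same-ish total power, wildly different energy per unit of spectrum.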
You can work the ISS voice repeater with a handheld and it is around 400km away. Line of sight is everything and more so as you increase the frequency.
Without the bit rate there's not enough information to know if it's impressive or not. Define a low enough bit rate as a usable radio link and it can be 10 times that length with a 10th of the power level (Shannon-Hartley).
Perhaps given the limitations of 802.11 it means something but in theory it's meh.
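Shannon-Hartley makes that rate/SNR trade concrete. Illustrative numbers for a 20 MHz Wi-Fi-width channel:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity: C = B * log2(1 + S/N), in bit/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# 20 MHz channel: capacity at 0 dB SNR (ratio 1) vs -10 dB SNR (ratio 0.1)
print(shannon_capacity_bps(20e6, 1.0) / 1e6)            # 20.0 Mbit/s
print(round(shannon_capacity_bps(20e6, 0.1) / 1e6, 2))  # 2.75 Mbit/s
```

Capacity shrinks as the SNR drops but never hits zero, which is why "a usable radio link" is meaningless without a bit rate attached.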
A lot of people don't realize that Wi-Fi pretty much always operates in an extremely degraded mode (on the verge of not working at all - which is why a person walking past can be enough to break it) and it only seems to work for certain applications due to the magic of protocols such as TCP which can recover and essentially conceal packet loss. Jumping onto a call or some other real-time application (where retransmissions don't help and actually make the problem worse) will break the illusion and show you the reality of things.
There's also this misconception that speed is the only thing that matters, so "higher speed = better connection", and the vast majority of consumer-grade tools only ever test for this. This is where it will mislead you: an unstable connection with short bursts of high speed will appear "better" (despite being fragile and completely unusable for anything real-time) than a slower but more reliable connection. In-browser speed tests are extremely bad for this because all the buffering and the various layers can fool the test code itself, making it believe it's getting a steady stream while in reality it's getting short bursts of data in between tons of underlying TCP retransmissions. iperf and ping are the tools of choice if you actually want to look into it - they are closer to the metal and will give you faster feedback (you will be able to actually see the dropouts due to packet loss).
Is there anything short of "get a better antenna" that helps address these issues? Recently had to switch to Wi-Fi for my desktop and have been frustrated with the inability to stop the connection from constantly being on the edge of not-working.
More access points is the only real solution, ideally with Ethernet backhaul - mesh systems still consume spectrum, which is the scarce resource here. There's a finite amount of it (that you need to share with neighbors, especially on 2.4 GHz, plus potential interference) and you need to maximize efficiency, so only use it for mobile devices that can't be wired; static access points (and other devices - TV, gaming console, etc.) should be wired.
Mesh systems are a step up and can work but due to their cost (and also the fact it's usually hard to tell how well it'll perform without actually buying it and trying it out in the field) I would very much recommend just biting the bullet and doing it once and well with wired access points. A stop-gap solution would be to use powerline adapters to provide the Ethernet backhaul to the access points; you'll be limited in terms of bandwidth (mine top out at ~100Mbps in my current property, but went to ~300 in my previous one) but latency and packet loss-wise they're rock-solid and won't hog the precious Wi-Fi spectrum.
Forget consumer-grade crap as well, go for enterprise-grade equipment or at the very least "prosumer" grade such as Ubiquiti.
OP chose a hill 250 meters above sea level, and also has an antenna about 5m high. According to a calculator I found[1] that puts the distance of the horizon at roughly 57km. So plenty of range to still be in LOS of the transmitter.
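That calculator result is easy to reproduce: the geometric horizon is roughly sqrt(2·R·h). A sketch:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def horizon_km(height_m):
    """Approximate geometric distance to the horizon from a given height."""
    return math.sqrt(2 * EARTH_RADIUS_M * height_m) / 1000

# OP's hilltop: ~250 m elevation plus a ~5 m antenna mast
print(round(horizon_km(255), 1))  # ~57.0 km
```

Radio horizons are slightly farther still; atmospheric refraction is commonly folded in by using 4/3 of Earth's radius in the same formula.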
(After posting this I thought of "liquid/atmosphere only" planets and how they are round, so obviously my initial thought process was wrong - I'd pictured water following the bottom of its curved-earth container while trying to remain level on top.)
This is nonsense. Do you believe that water forms a cliff at the edge of the beach so that its curvature tangent to the center of the body of water is zero? Hilarious