"Pay my troops no mind; they're just on a fact-finding mission."

Replying To Nicholas Eftimiades On Intelligence at the Speed of Thought

The link to his original post is here:

For future national security needs, the most stressing intelligence requirements will be for remote-sensing systems to detect, track, cross-cue, and characterize fleeting targets in real time. This ability will require a global network of sensors to detect and track individuals, vehicles, chemicals, materials, and emanations, and a space network backbone to move data. Pervasive CCTV systems now present worldwide in airports, border crossings, railroads, buses, and on the streets of many cities will be integrated and supported by powerful computers, smart software agents, vast facial pattern and retina recognition databases, and communications infrastructure. These systems will be integrated with sensors and databases detecting, identifying, and characterizing spectral signatures, chemical compositions, DNA, effluents, sounds, and much more.

My Response:

There’s an interesting free tool called Eureqa that searches for hidden mathematical equations in data (a technique known as symbolic regression). It’s amazing how quickly supervised and unsupervised learning algorithms, along with gesture recognition, are developing.
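To make the idea concrete, here is a drastically simplified toy version of equation discovery in Python. This is only a sketch of the concept, not Eureqa's actual algorithm (which evolves expression trees rather than scoring a fixed candidate list): score a handful of candidate formulas against data generated from a hidden law and keep the best fit.

```python
# Toy symbolic-regression search: data comes from the hidden law
# y = 3*x**2 + 2, and we score candidate formulas by mean squared error.
data = [(x, 3 * x**2 + 2) for x in range(-5, 6)]

candidates = {
    "y = x":          lambda x: x,
    "y = x**2":       lambda x: x**2,
    "y = 2*x + 3":    lambda x: 2 * x + 3,
    "y = 3*x**2 + 2": lambda x: 3 * x**2 + 2,
}

def mse(f):
    """Mean squared error of candidate f over the data set."""
    return sum((f(x) - y) ** 2 for x, y in data) / len(data)

best = min(candidates, key=lambda name: mse(candidates[name]))
print(best)  # the candidate matching the hidden equation wins: y = 3*x**2 + 2
```

A real system searches an open-ended space of expressions instead of a hand-written list, but the fitness-scoring loop is the same in spirit.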

Over time we’ll develop better long-range sensors to detect emotional valence and arousal, so we can judge the details of a person’s emotional state and correlate it with the rest of the data. Thermal and hyperspectral imaging can be used to judge the blood flow to an area like the face, indicating stress. We have simple things today, EPS and heartbeat sensors, eye-tracking software, but it’s all developing over time. Simple Microsoft Kinect sensors just build a basic stick-figure skeleton, but newer sensors with more potential are being developed. This input will likely improve agent-based modeling software, since we’ll be able to use actual emotions as inputs.
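As a toy illustration of the blood-flow idea (with made-up numbers; a real pipeline would first segment the face region out of thermal imagery and calibrate per person), one could track mean facial temperature over time and flag a sustained rise above a resting baseline:

```python
# Sketch of a stress flag from a thermal time series. All values are
# hypothetical; the threshold is illustrative, not calibrated.
baseline_frames = [34.1, 34.0, 34.2, 34.1]        # resting forehead temps (deg C)
live_frames     = [34.2, 34.6, 35.0, 35.1, 35.2]  # later readings

baseline = sum(baseline_frames) / len(baseline_frames)
THRESHOLD = 0.5  # degrees C above baseline before we flag stress

def stressed(frames, baseline, threshold=THRESHOLD):
    """Flag stress when the mean of the last few frames rises well above baseline."""
    recent = sum(frames[-3:]) / len(frames[-3:])
    return recent - baseline > threshold

print(stressed(live_frames, baseline))      # True: sustained ~1 deg C rise
print(stressed(baseline_frames, baseline))  # False: still at rest
```

The hard parts in practice are exactly what the response below points out: you need either a per-person resting signature or continuous monitoring to establish that baseline.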

Satellite launch costs are going down and NASA is turning LEO over to the private sector, so we can expect an increase in space-based sensors and services. That might lead to better climate detection models and therefore better advanced hurricane/tornado warning times.

It also applies to AGI research: much of the data we learn from comes in through vision, among other senses. So from the perspective of building an AGI, adding more sensors means it can get smarter much faster and in entirely new ways. When you talk about integrating that level of sensory information and processing it, you end up with intelligence that makes the difference between an Einstein and a village idiot seem as tiny as a grain of sand.

We can already load sounds into programs like Wolfram Mathematica and analyze them, extract data, and then plot, graph, or connect the data in hundreds of other ways. I’m not as familiar with MATLAB, but I know it has a wide range of functions too.
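The same extract-and-analyze workflow works outside Mathematica or MATLAB too. Here is a small NumPy sketch that synthesizes a tone and recovers its dominant frequency with an FFT:

```python
import numpy as np

# Synthesize one second of a 440 Hz sine tone, then recover its frequency
# from the magnitude spectrum. With rate samples over one second, the FFT
# bins land on whole hertz, so the peak falls exactly at 440 Hz.
rate = 8000                       # samples per second
t = np.arange(rate) / rate        # one second of timestamps
signal = np.sin(2 * np.pi * 440 * t)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / rate)
dominant = freqs[np.argmax(spectrum)]

print(dominant)  # 440.0
```

Loading a recorded file instead of a synthetic tone just swaps the first few lines for an audio-reading call; the analysis side is identical.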

Right now the main concern is reducing interface friction so humans and machines can work together properly, but every year more functions are being added to the software and more data is being captured. Eventually we’re going to need a significant step up in intelligence to be able to work with it.

Thinking more on it, the alternative may be that things will get easier to use.

Programming languages have become somewhat simpler over time. As compilers catch up and can manage memory as well as or better than humans, dropping down to the C/C++ level won’t be required, as long as there aren’t incompatibility issues. GUIs have gotten better over the years as well.

I wonder how humans will choose to control access and connections between their AI/AGI programs as time goes on. The newer generation isn’t as concerned about privacy and is willing to give out tons of data on Twitter and Facebook.

Another area that’s still very empty: implant security. Most of these devices can pick up wireless signals now, and hackers have already figured out ways to interfere with pacemakers and the like. Ditto for self-driving cars: people have already spoofed GPS receivers into reporting false data. We’re going to have attack-versus-defense issues, security versus accessibility, the works.
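One simple defensive idea against the GPS spoofing mentioned above (a sketch only, not a production mitigation; real receivers also cross-check signal strength, clock drift, and inertial sensors) is a plausibility check that rejects position fixes implying impossible speed:

```python
import math

# Sketch of a GPS-spoofing sanity check: reject a new fix if reaching it
# from the previous fix would require exceeding a plausible maximum speed.
MAX_SPEED_MPS = 70.0  # ~250 km/h, a generous ceiling for a road vehicle

def plausible(last_fix, new_fix, max_speed=MAX_SPEED_MPS):
    """Each fix is (x_metres, y_metres, t_seconds) in a local frame."""
    x0, y0, t0 = last_fix
    x1, y1, t1 = new_fix
    dt = t1 - t0
    if dt <= 0:
        return False  # time must move forward between fixes
    dist = math.hypot(x1 - x0, y1 - y0)
    return dist / dt <= max_speed

print(plausible((0, 0, 0), (60, 0, 1)))    # True: 60 m/s is physically possible
print(plausible((0, 0, 0), (5000, 0, 1)))  # False: a 5 km jump in one second
```

It's exactly the attack-versus-defense pattern described above: a cheap check raises the bar, and attackers respond with slower, more gradual spoofing that stays under the threshold.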

(Technical note: Kinect skeleton tracking is done in software, but improvements to the hardware will affect its capabilities.)

His response:

I agree with most of what you wrote, but I don’t think lowering launch costs is going to lead to better climate detection models. Those space-based sensors are excellent now; better models are more a function of computing power and airborne/ground-based sensors. And an increase in space-based sensors and services is going to be driven more by electronics miniaturization, allowing more capability in orbit for the same launch price. But I agree launch costs will come down as well.

Also, thermal and hyperspectral imaging can be used to judge the blood flow to an area like the face, but it is only useful if you have the spectral signature of that specific face at rest and under stress. Either that, or you have continuous monitoring and can watch the blood flow go up and down.

Implant security is an area of concern. A UK college professor recently demonstrated infecting numerous devices with an embedded biochip.

Cool discussion.
