Category: Technology

GoFCCYourself(.com)

You know what you find when you drain a swamp? A whole bunch of rotting detritus. I’m not going to pretend astonishment that a former Associate General Counsel from Verizon thinks net neutrality is a terrible idea. I remember getting an e-mail message from my employer, another network provider, detailing how this terrible proposal was going to drive us all out of business. Or something similarly over-dramatic.

Facilitating public comment on Executive branch proceedings, the way GoFCCYourself.com does, is an interesting idea: take a circuitous government web site that ostensibly allows individuals to post comments on issues, then circumvent the terrible user interface by publishing your own URL that (I assume) includes the appropriate POST data to drop individuals in exactly the right place to submit their comments.

I’ve used this short-cut to submit my opinion to the FCC, but I also forwarded the same message to my representative in the House and my state’s two Senators:

I have submitted this to the FCC for Docket 17-108 but wanted to include you as well. If the FCC does roll back net neutrality, as their chairman indicates is his desire, I beseech you to ready legislative controls to prevent ISPs from using speed controls to essentially censor Internet content.

I am writing to express my support for “net neutrality” — while you want to claim it reduces carrier investment or innovation, customer acquisition and retention is what drives carrier investment and innovation. Lowering the cost of operations, creating a service that commands a higher price point, or offering a new service unavailable through a competitor: those drive innovation. Allowing a carrier to create a new revenue stream by charging content providers for faster access is not innovation – QoS has been around for decades. And it isn’t as if the content is being delivered to the Internet for free. Content providers already pay for bandwidth — and a company like Netflix probably paid a LOT of money for bandwidth at their locations. If Verizon didn’t win a bid for network services to those locations, that’s Verizon’s problem. Don’t create a legal framework for every ISP to profit from *not* providing network services for popular sites; the network provider needs to submit a more competitive bid.

What rolling back net neutrality *does* is stifle customers and content providers. If I, as a customer, am paying 50$ a month for my Internet service but find the content that I *want* is de-prioritized and slowed … well, in a perfect capitalist system, I would switch to the provider who ‘innovates’ and goes back to their 2017 configurations. But broadband access – apart from some major metro areas – is not a capitalist system. Where I live, outside of the Cleveland suburbs, I have my choice of the local cable company or satellite – and satellite-based Internet introduces a lot of latency and is quite expensive for both the customer and the operator (and has data limits, which themselves preclude a lot of the network-intensive traffic that ISPs wish to de-prioritize). That’s not a real choice — pay 50$ to this company who is going to de-prioritize anyone who doesn’t pay their network bandwidth ransom, or pay 100$ to some other company that is unable to provide sufficiently low latency to allow me to work from home. So add an hour of commute time, fuel, vehicle wear, and reduced family time to that 100$ bill.

Rolling back net neutrality stifles small businesses — it’s already difficult to compete with large corporations who have comparatively unlimited budgets for advertising and lawyers. Today, a small business is able to present their product online with equal footing. In 1994, I worked at a small University. One of my initiatives was to train departmental representatives on basic HTML coding so the college would have an outstanding presence on the Internet. First hour of the first day of the training session included a method for checking load times off campus without actually having to leave the campus network. On campus, we were 10 meg between buildings and the server room and anything loaded quite quickly. At home, a prospective student was dialing in on a 28.8 modem. If your content is a web page for MIT, a prospective engineering student may be willing to click your site, go eat dinner, and come back. Load time isn’t as much of a problem for an organisation with a big name and reputation. Unknown little University in Western PA? Click … wait … wait, eh, never mind. The advent of DSL was amazing to me because it provided sufficient bandwidth and delivered content with parity that allowed an unknown Uni to offer a robust web site with videos of the exciting research opportunities available to students and the individual attention from professors that small class sizes allow. No longer did we need to restrict graphics and AV on our site because we weren’t a ‘big name’ University. That there ever was a debate about removing this parity astonished me.

Aside from my personal opinion, what is the impact of non-neutral networks on free speech? Without robust legal controls, ISPs can engage in a form of quasi-censorship. How do you intend to prevent abuse of the system? Is a large corporation going to be able to direct “marketing” dollars to speeding up their page to the harm of their competitors? Can the Coca-Cola Company pay millions of dollars to have their content delivered faster than PepsiCo’s? Is the ISP then the winner in a bidding war between the two companies? What about political content? Does my ISP now control the speed at which political content is delivered? What happens when Democrats raise more money in the Cleveland metro area and conservative views are relegated to the ‘slow’ lane? What happens when the FCC gets de-prioritized because ISPs want even less regulation?

I would still worry about the legal controls to prevent quasi-censorship, but I would object less if the FCC implemented the net neutrality requirements the way some telco regulations applied to ILECs in areas where no CLECs operated — where there is no or limited competition, net neutrality is a requirement. Where there are a dozen different ISP options, they can try selling the QoS’d packages. Polls and voting aside, the ISP will find out exactly how many customers or content providers support non-neutral networks.

Owntracks Stuck In “Connecting” To MQTT When Using WebSockets

Our home automation presence is maintained through an Android app, OwnTracks, which updates a Mosquitto server via a WebSockets reverse proxy. Mosquitto runs on a Fedora 25 server and was installed from the default RPM repository.
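For context, the reverse proxy piece is just Apache’s websocket tunnel module pointed at the Mosquitto websockets listener. A minimal sketch of the relevant config (the module paths, the /mqtt path, and the 9001 port are illustrative assumptions, not necessarily our exact setup):

LoadModule proxy_module modules/mod_proxy.so
LoadModule proxy_wstunnel_module modules/mod_proxy_wstunnel.so
# forward websocket upgrade requests for /mqtt to the local Mosquitto websockets listener
ProxyPass "/mqtt" "ws://127.0.0.1:9001/"
ProxyPassReverse "/mqtt" "ws://127.0.0.1:9001/"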

Recently, we stopped receiving location updates – both of our Android clients were stuck “Connecting” to the MQTT server. Nothing appeared in the Apache access or error logs, and capturing network traffic only got a small number of packets (TCP session overhead ‘stuff’). Even bypassing the reverse proxy and using the internal network to communicate directly to the Mosquitto server only created a couple of packets. Using a test client (http://www.hivemq.com/demos/websocket-client/), I saw strange connection failures — so I knew the problem was not specific to the OwnTracks client.

It seems there was a bug in libwebsockets v2.1.1 (and possibly others) — when we updated our Fedora installation, the new libwebsockets broke our MQTT over WebSockets. Currently, the Fedora repository still contains an impacted version of libwebsockets. To resolve the issue, I built the latest stable libwebsockets and built mosquitto against this updated library.

Process: The first step is to remove the dnf-managed packages (rpm -e libwebsockets libwebsockets-devel mosquitto). Then build libwebsockets and mosquitto.

To Build LibWebSockets:

wget https://github.com/warmcat/libwebsockets/archive/master.zip
unzip master.zip
cd libwebsockets-master/
mkdir build
cd build
cmake ..
make
make install
cp libwebsockets.pc /usr/lib/
cp lws_config.h /usr/include/
cp ../lib/libwebsockets.h /usr/include/
cp ./lib/libwebsockets.so /usr/lib/
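If the freshly built library is not picked up at link or run time, refreshing the linker cache and confirming the shared object is visible can save some head-scratching (assuming a stock linker configuration):

ldconfig
ldconfig -p | grep libwebsockets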

To Build Mosquitto:

wget https://github.com/eclipse/mosquitto/archive/master.zip
unzip master.zip
cd mosquitto-master/
vi config.mk # Line 68, change to "WITH_WEBSOCKETS:=yes"
make
make install
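The broker config also needs a websockets listener defined; ours already had one, but for completeness a minimal excerpt looks something like this (the port numbers are examples, match whatever your reverse proxy points at):

# /etc/mosquitto/mosquitto.conf (excerpt)
listener 1883
protocol mqtt
listener 9001
protocol websockets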

Start the Mosquitto server and try again. Voila, presence works again!
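If the clients still sit at “Connecting”, a quick sanity check from the broker host is to subscribe directly over plain MQTT (assuming the stock 1883 listener and the default OwnTracks topic tree):

mosquitto_sub -h 127.0.0.1 -p 1883 -t 'owntracks/#' -v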

Compiling Open ZWave On Fedora 25

Mostly writing this down for myself, for the next time we need to run Open ZWave and build the latest version:

Download libmicrohttpd
Gunzip & untar it
cd libmicrohttpd
./configure
make
make install

Download and build the open-zwave library

mkdir /opt/ozw
cd /opt/ozw
git clone https://github.com/OpenZWave/open-zwave.git
cd open-zwave
make

If the build fails with an error saying you don’t have libudev.h, install systemd-devel (dnf install systemd-devel) and try that make again.

Download open-zwave-control-panel
cd /opt/ozw
git clone https://github.com/OpenZWave/open-zwave-control-panel.git
cd open-zwave-control-panel

Open the Makefile and find the following line:
OPENZWAVE := ../

Change it to:
OPENZWAVE := ../open-zwave

Then find the section that says:
# for Linux uncomment out next three lines
LIBZWAVE := $(wildcard $(OPENZWAVE)/*.a)
#LIBUSB := -ludev
#LIBS := $(LIBZWAVE) $(GNUTLS) $(LIBMICROHTTPD) -pthread $(LIBUSB) -lresolv

# for Mac OS X comment out above 2 lines and uncomment next 5 lines
#ARCH := -arch i386 -arch x86_64
#CFLAGS += $(ARCH)
#LIBZWAVE := $(wildcard $(OPENZWAVE)/cpp/lib/mac/*.a)
LIBUSB := -framework IOKit -framework CoreFoundation
LIBS := $(LIBZWAVE) $(GNUTLS) $(LIBMICROHTTPD) -pthread $(LIBUSB) $(ARCH) -lresolv

And switch it around to be Linux … the Makefile becomes:
# for Linux uncomment out next three lines
LIBZWAVE := $(wildcard $(OPENZWAVE)/*.a)
LIBUSB := -ludev
LIBS := $(LIBZWAVE) $(GNUTLS) $(LIBMICROHTTPD) -pthread $(LIBUSB) -lresolv

# for Mac OS X comment out above 2 lines and uncomment next 5 lines
#ARCH := -arch i386 -arch x86_64
#CFLAGS += $(ARCH)
#LIBZWAVE := $(wildcard $(OPENZWAVE)/cpp/lib/mac/*.a)
#LIBUSB := -framework IOKit -framework CoreFoundation
#LIBS := $(LIBZWAVE) $(GNUTLS) $(LIBMICROHTTPD) -pthread $(LIBUSB) $(ARCH) -lresolv

ln -sd ../open-zwave/config
make

Then you can run it:
./ozwcp -p 8889

./ozwcp: error while loading shared libraries: libmicrohttpd.so.12: cannot open shared object file: No such file or directory

strace it (strace ./ozwcp -p 8889)

open(“/lib64/tls/x86_64/libmicrohttpd.so.12”, O_RDONLY|O_CLOEXEC) = -1 ENOENT (No such file or directory)
stat(“/lib64/tls/x86_64”, 0x7ffefb50d660) = -1 ENOENT (No such file or directory)
open(“/lib64/tls/libmicrohttpd.so.12”, O_RDONLY|O_CLOEXEC) = -1 ENOENT (No such file or directory)
stat(“/lib64/tls”, {st_mode=S_IFDIR|0555, st_size=4096, …}) = 0
open(“/lib64/x86_64/libmicrohttpd.so.12”, O_RDONLY|O_CLOEXEC) = -1 ENOENT (No such file or directory)
stat(“/lib64/x86_64”, 0x7ffefb50d660) = -1 ENOENT (No such file or directory)
open(“/lib64/libmicrohttpd.so.12”, O_RDONLY|O_CLOEXEC) = -1 ENOENT (No such file or directory)
stat(“/lib64”, {st_mode=S_IFDIR|0555, st_size=122880, …}) = 0
open(“/usr/lib64/tls/x86_64/libmicrohttpd.so.12”, O_RDONLY|O_CLOEXEC) = -1 ENOENT (No such file or directory)
stat(“/usr/lib64/tls/x86_64”, 0x7ffefb50d660) = -1 ENOENT (No such file or directory)
open(“/usr/lib64/tls/libmicrohttpd.so.12”, O_RDONLY|O_CLOEXEC) = -1 ENOENT (No such file or directory)
stat(“/usr/lib64/tls”, {st_mode=S_IFDIR|0555, st_size=4096, …}) = 0
open(“/usr/lib64/x86_64/libmicrohttpd.so.12”, O_RDONLY|O_CLOEXEC) = -1 ENOENT (No such file or directory)
stat(“/usr/lib64/x86_64”, 0x7ffefb50d660) = -1 ENOENT (No such file or directory)
open(“/usr/lib64/libmicrohttpd.so.12”, O_RDONLY|O_CLOEXEC) = -1 ENOENT (No such file or directory)
stat(“/usr/lib64”, {st_mode=S_IFDIR|0555, st_size=122880, …}) = 0

Huh … not looking in the right place. I’m sure there’s a right way to sort this, but we’re using Open ZWave for a couple of minutes to test some ZWave security stuff. Not worth the time:

ln -s /usr/local/lib/libmicrohttpd.so.12.41.0 /usr/lib64/libmicrohttpd.so.12
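For the record, the less hacky fix would be to tell the linker about /usr/local/lib rather than symlinking individual libraries (a sketch, assuming nothing else in /usr/local/lib conflicts):

echo '/usr/local/lib' > /etc/ld.so.conf.d/local.conf
ldconfig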

Try again (./ozwcp -p 8889). Voila, “2017-04-17 20:35:05.223 Always, OpenZwave Version 1.4.0 Starting Up”. Use your browser to hit http://<ipaddress>:8889 to access the Open ZWave Control Panel.

Uninformed Upgrades (PHP 5 => 7)

TL;DR: Check the list of what is being updated before you let an OS automatically update its programs.

We have a home automation / MythTV / ZoneMinder server with automatic updates disabled. In the process of updating OpenHAB to OpenHAB2, Scott suggested we update everything else while we were at it. No big deal: did a quick “dnf update” … got a gig of packages downloaded, waited for >1400 packages to install, and rebooted.

PHP could not talk to MySQL. At all. ZoneMinder just threw an error saying we didn’t have the PHP MySQL module installed (it worked half an hour ago, so it is INSTALLED). MythWeb completely failed to load – just a white screen. The quick web view of OpenHAB persistence history threw a class not found error.

I checked to see if the extensions were loaded (use the command “print_r(get_loaded_extensions());” in a PHP page) – huh, a LOT of my modules were missing. But there weren’t any useful errors anywhere indicating why.
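The same check can also be run from the command line, which is handy when the web server itself won’t serve a test page (assuming the CLI and the Apache module read the same php.ini, which isn’t guaranteed):

php -r 'print_r(get_loaded_extensions());'
php -i | grep -i 'loaded configuration'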

I modified the php.ini file to show startup errors.

[root@fedora01 conf.modules.d]# grep display_startup_errors /etc/php.ini
; display_startup_errors
display_startup_errors = On

Oooooh, now there are errors! A lot of them. Not particularly useful, but at least a good clue that this isn’t going to go so well for me:

PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/pdo.so’ – /usr/lib64/php/modules/pdo.so: undefined symbol: zend_ce_exception in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/mysqlnd.so’ – /usr/lib64/php/modules/mysqlnd.so: undefined symbol: zend_hash_str_del in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/bcmath.so’ – /usr/lib64/php/modules/bcmath.so: undefined symbol: _emalloc_16 in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/bz2.so’ – /usr/lib64/php/modules/bz2.so: undefined symbol: zend_fetch_resource2_ex in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/calendar.so’ – /usr/lib64/php/modules/calendar.so: undefined symbol: _emalloc_32 in Unknown on line 0
PHP Warning: PHP Startup: ctype: Unable to initialize module\nModule compiled with module API=20151012\nPHP compiled with module API=20131226\nThese options need to match\n in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/curl.so’ – /usr/lib64/php/modules/curl.so: undefined symbol: zend_list_close in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/dom.so’ – /usr/lib64/php/modules/dom.so: undefined symbol: zend_ce_exception in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/exif.so’ – /usr/lib64/php/modules/exif.so: undefined symbol: zend_hash_str_exists in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/fileinfo.so’ – /usr/lib64/php/modules/fileinfo.so: undefined symbol: zend_list_close in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/ftp.so’ – /usr/lib64/php/modules/ftp.so: undefined symbol: zend_fetch_resource2 in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/gd.so’ – /usr/lib64/php/modules/gd.so: undefined symbol: zend_list_close in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/gettext.so’ – /usr/lib64/php/modules/gettext.so: undefined symbol: zend_parse_arg_str_slow in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/iconv.so’ – /usr/lib64/php/modules/iconv.so: undefined symbol: _zval_get_string_func in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/json.so’ – /usr/lib64/php/modules/json.so: undefined symbol: _emalloc_56 in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/mbstring.so’ – /usr/lib64/php/modules/mbstring.so: undefined symbol: zend_hash_str_del in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/mysqlnd.so’ – /usr/lib64/php/modules/mysqlnd.so: undefined symbol: zend_hash_str_del in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/phar.so’ – /usr/lib64/php/modules/phar.so: undefined symbol: zend_sort in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/posix.so’ – /usr/lib64/php/modules/posix.so: undefined symbol: _zend_hash_str_update in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/shmop.so’ – /usr/lib64/php/modules/shmop.so: undefined symbol: zend_list_close in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/simplexml.so’ – /usr/lib64/php/modules/simplexml.so: undefined symbol: zend_ce_exception in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/sockets.so’ – /usr/lib64/php/modules/sockets.so: undefined symbol: zend_hash_str_del in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/sqlite3.so’ – /usr/lib64/php/modules/sqlite3.so: undefined symbol: zend_ce_exception in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/sysvmsg.so’ – /usr/lib64/php/modules/sysvmsg.so: undefined symbol: _emalloc_64 in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/sysvsem.so’ – /usr/lib64/php/modules/sysvsem.so: undefined symbol: _emalloc_24 in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/sysvshm.so’ – /usr/lib64/php/modules/sysvshm.so: undefined symbol: zend_list_close in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/tidy.so’ – /usr/lib64/php/modules/tidy.so: undefined symbol: _zend_hash_str_update in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/tokenizer.so’ – /usr/lib64/php/modules/tokenizer.so: undefined symbol: _emalloc_large in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/xml.so’ – /usr/lib64/php/modules/xml.so: undefined symbol: _zend_hash_str_add in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/xmlwriter.so’ – /usr/lib64/php/modules/xmlwriter.so: undefined symbol: _emalloc_16 in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/xsl.so’ – /usr/lib64/php/modules/xsl.so: undefined symbol: dom_node_class_entry in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/mysql.so’ – /usr/lib64/php/modules/mysql.so: undefined symbol: mysqlnd_connect in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/mysqli.so’ – /usr/lib64/php/modules/mysqli.so: undefined symbol: zend_ce_exception in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/pdo_mysql.so’ – /usr/lib64/php/modules/pdo_mysql.so: undefined symbol: mysqlnd_allocator in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/pdo_sqlite.so’ – /usr/lib64/php/modules/pdo_sqlite.so: undefined symbol: php_pdo_unregister_driver in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/wddx.so’ – /usr/lib64/php/modules/wddx.so: undefined symbol: zend_list_close in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/xmlreader.so’ – /usr/lib64/php/modules/xmlreader.so: undefined symbol: dom_node_class_entry in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library ‘/usr/lib64/php/modules/json.so’ – /usr/lib64/php/modules/json.so: undefined symbol: _emalloc_56 in Unknown on line 0

Turns out DNF installed PHP 7, but didn’t do anything to remove the PHP 5 modules from my Apache configuration:

[root@fedora01 tmp]# cd /etc/httpd/modules
[root@fedora01 modules]# grep php *
Binary file libphp5.so matches
Binary file libphp5-zts.so matches
Binary file libphp7.so matches
Binary file libphp7-zts.so matches

[root@fedora01 modules]# mkdir /tmp/oldphp
[root@fedora01 modules]# mv libphp5* /tmp/oldphp

And remove them from the conf.modules.d too (if you just remove the module files but still try to load them from conf.modules.d, Apache will simply fail to start). You could instead edit the LoadModule lines out of the existing conf file … but I don’t want a lot of no-longer-used configuration sitting there to confuse me in a year or two!

[root@fedora01 modules]# cd /etc/httpd/conf.modules.d/
[root@fedora01 conf.modules.d]# grep php *
10-php.conf: LoadModule php5_module modules/libphp5.so
10-php.conf: LoadModule php5_module modules/libphp5-zts.so
15-php.conf:# Cannot load both php5 and php7 modules
15-php.conf:<IfModule !mod_php5.c>
15-php.conf: LoadModule php7_module modules/libphp7.so
15-php.conf:<IfModule !mod_php5.c>
15-php.conf: LoadModule php7_module modules/libphp7-zts.so

[root@fedora01 conf.modules.d]# mv 10-php.conf /tmp/oldphp/

Then restart Apache without PHP 5:

[root@fedora01 conf.modules.d]# service httpd start
Redirecting to /bin/systemctl start httpd.service

Voila, perfectly functioning web sites. And, yeah, I should probably check the list of “what will be updated” when I update a server. Would save HOURS of reading through strace output to find out old versions were still hanging about.
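For next time, a couple of commands to see what a dnf run is actually going to touch (and what a past run did) without reading the transaction prompt under time pressure:

dnf check-update                 # list packages with pending updates; makes no changes
dnf check-update | grep -i php   # is something major like PHP on the list?
dnf history info                 # details of the most recent transaction, after the fact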

 

Smart Home (In)Security

I’ve seen a lot of articles recently about hacked IoT devices, and now one about a malicious company disrupting a customer’s service in retaliation for poor reviews (and possibly abusive calls to technical support). I certainly don’t think *everything* needs to be connected to the Internet. If you want to write messages on toast remotely, whatever … but beyond gimmicks, there are certainly products where the Internet offers no real advantage. But a lot of articles disparage the idea of a smart home based on these goofy products.

There are devices that are more convenient than their ‘dumb’ counterparts. Locks that unlock when you are nearby. Garage lights that come on when the door is unlocked or opened. And if that was the extent of home automation, I guess you could still call it a silly fad.

But there are a LOT of connected devices that save resources: Exterior lighting that illuminates as you near your house. Motion detectors controlling light switches and bulbs so you (or the kids) cannot forget to turn out the lights. An outlet that turns OFF to eliminate draw when appliances are in ‘standby’ mode saved us about 50$/year just on the television/receiver. Moisture sensors that control a sprinkler system so the grass is only watered when there is actual need. Water flow sensors that can alert you to unusual usage (e.g. when the water filter system gasket goes and it starts dumping water through the thing 24×7).

And some that prevent real damage to your home or person. If your house uses combustion for heat, configure the carbon monoxide sensor to shut off the HVAC system when CO levels are too high. Leak sensors can shut off the water mains when a leak is detected (and turn off appliances in the wet area if there’s potential for shorting).

The major security problem with any IoT device, smart home systems included, is that you’ve connected private resources to the Internet. With all the hackers, punks, and downright malicious people out there. And from a privacy standpoint, you are providing information that can be mined to enhance marketing profiles — very carefully read the privacy policies of any company whose platform you will be using. Maybe a ‘smart’ coffee machine sounds good to you — but are they collecting (and potentially selling to third parties) information about how many cups of coffee you brew and the times of day you brew them? Whether you care is a personal decision, but it’s something that should be considered just the same.

When each individual device has its own platform, the privacy and security risks grow. A great number of these devices don’t need to be connected to the Internet directly but rather to a relay point (hub). From a business perspective, this is a boon … since you have a Trane furnace (big money, not apt to be replaced yearly), you should also buy these other products that we sell and pay the monthly recurring fee to use our Nexia platform for all of your other smart devices. Or since you have a Samsung TV with a built-in hub … you should not only buy these other Samsung products, but hook all of your other smart ‘things’ up to SmartThings. And in a year or two when you’re shopping for a new TV … wait, you need one with a SmartThings hub or you’re going to have to port your existing configuration to a new vendor. Instant customer loyalty.

For an individual, the single relay point reduces risk (it’s not one of a dozen companies that need to be compromised to affect me, just this one) and confusion (I only have to keep track of one company’s privacy policy). *But* it also gives one company a lot more information. The device type is often indicative, but most people name their devices according to location (e.g. bedroom light, garage light, front door). Using SmartThings, Samsung knew when we went to bed and woke up, that we ate breakfast before brushing teeth (motion in hallway, motion in kitchen, water usage, power draw on appliances, motion in hallway, motion in bathroom, water usage) or showering (power draw on hot water tank, increased water usage). Which rooms we frequented (motion), when we watched TV (not what we watched, but when), when we left the house (no motion, presence change). How often we wash laundry (power draw on washer, water usage) and dishes (power draw on dishwasher, water usage). Temperature in the house (as reported from multi-sensor devices or from a smart thermostat), and whether we change settings for day/night. How often we drive a car (garage door open/closed with presence change, or speed of location change on presence), how much time we spend away from home. How often we have overnight guests (motion in guest bedroom at night).

And, yeah, the profile they glean is a guess. I might open the garage door when mowing grass. Or I might have rooms with no motion sensors for which they cannot account. But they have a LOT of data on which to base their guesses, and no one selling targeted advertising profiles claims to be 100% accurate. Facebook’s algorithm, for quite some time, had me listed as a right-leaning Trump supporter. I finally tired of seeing campaign ads on their site and manually updated my advertising profile. Point is, one company has a lot of data from which they can build fairly good targeted profiles. How much of our house is actually used (a lot of bedrooms that rarely see motion? you get a ‘downsizing specialist’ real estate flyer; motion in every room all the time? you get a flyer about finding a larger home to give you all some space). If the HVAC system is connected, they could create a target group of “people who could use additional insulation or sealing in their house” (outdoor temp for location v/s indoor temp for location v/s energy draw).

In some ways, it’s cool that a company might be able to look at my life and determine a need of which I am not even aware. Didn’t realize how much of our energy bill was HVAC – wow, tightening the house and adding insulation will save how much?! But it’s also potentially offensive: yeah, we could use a bigger house for all of these people. We could also use a bigger pay cheque, what of it? Yeah, the kids moved out … but this is our house, and why would you tell me I should be leaving? And generally invasive — information that doesn’t really cause harm, but that they’ve got no reason to know either.

What articles highlighting the insecurity of IoT devices seem to miss is that the relay point can reside on your local network with no Internet access. We personally use OpenHAB – which enables our home automation to function completely inside our local network. You trust the developers (or don’t, ours is open source … you can read the whole thing if you don’t want to trust developers), but you own the data and what is done with it.

You don’t need an expensive dedicated server to host your own home automation controller – a Raspberry Pi will do. What you do need is technical knowledge and a good bit of time (or hire someone to do it for you, in which case you need money and someone else’s time). But the end result is the same — physical presence is required to compromise the system. Since physical presence will also let you bump locks, smash windows, cut power, flick light switches, open doors … you’re no worse off than before.

Internet Privacy (Or Lack Thereof)

Well, the House passed Senate Joint Resolution 34 — which essentially tells the FCC that it cannot have the policy it enacted last year that prohibits ISPs from selling an account’s browsing history. What exactly does that mean? Well, they won’t literally sell your browsing history — anyone bored enough to peruse mine … I’d happily sell my browser history for the right price. But that’s not what is going to happen. For one thing, they’re asking for lawsuits — you visit a specific drug’s web site, or a few cancer treatment centres and your usage is indicative of specific medical conditions. An insurance company or employer buys your history and uses it to fire you or increase rates, and your ISP has created actual damages.

What will likely happen is the ISPs become more effective sellers of online advertising. They offer a slightly different service than current advertising brokers. The current brokers use cookies embedded in customers’ sites to track your browsing activity. If you clear your cookies, some of their tracking history is lost as well. If you use multiple computers (or even multiple browsers on one computer), they do not have a complete picture of your browsing because cookies are not shared between browsers or computers. If you browse in private mode (or block cookies, or use a third-party product to reduce personalized advertising), these advertisers may not be able to glean much about you at all. The ISP does not have any of these problems — no matter what computer or browser I use at home, the ISP will see the traffic. Since their traffic history is maintained on their side … there is nothing I can do to clear the history. Browse in private mode or block cookies and you’re still making a request that transits the ISP’s network.

The ISPs have disadvantages, though, as well. When you are using encrypted protocols (HTTPS, SSH, etc) … the ISP can see the destination IP and a bunch of encrypted gibberish. Now *something* about you can be determined by the destination IP (hit 151.101.129.164 a lot and I know you read the NYTimes online). Analysis of the encrypted content can be used to guess the content — that’s a bit of research that I don’t believe is currently being used for advertising, but there are researchers who catalog patterns of bitrate negotiation on YouTube videos and use it as a fingerprint to guess what video is being watched using only the encrypted traffic. Apart from some guessing, though, the ISP does not know exactly what is being done over encrypted communication channels (even the URL being requested – so while they may know I read the NYTimes, they don’t know if I read the political headlines, recipes, or concert listings out on LI). Cookie-based advertisers can, however, track traffic to encrypted (HTTPS) web sites. This is because site operators embed the cookie in their site … so where an ISP cannot read the data you transmit with an HTTPS site, the server in question *can* (otherwise it wouldn’t know what site you requested).

So while an ISP won’t sell someone a database of the URLs you’ve accessed last week, they will use that information to form advertising buckets and sell a specific number of ads being served to “people who browse yarn stores” or “people who read Hollywood gossip” or “right-leaning political activists”. Because they have limitations as well, ISP ad brokerages are unlikely to replace the cookie based individualized advertising. I suspect current advertising customers will spread their advertising dollars out between the two — get someone who can target you based on browsing over HTTPS and someone who can target you even if you block cookies.

What about using a VPN or Tor to anonymize your traffic? Well, that helps — in either case, your ISP can no longer determine the specific web sites you view. *But* they can still categorize you as a technically savvy and security-conscious individual and throw you into the “tech stuff” and “computer security stuff” advertising buckets.

You can opt out of the cookie-based individualized advertising — Network Advertising Initiative or Digital Advertising Alliance — an industry move that I assume was meant to quell customer anger and avoid government regulation (i.e. if enough people get angry and are not provided some type of redress, they’ll lobby their state/federal government to DO SOMETHING about it). The ISPs will likely create a similar set of policies and a process to opt out. Which means the resolution being passed to the president for signature essentially changed the ISP’s use of my individual browsing history from an opt-in (maybe as a condition of a lower price rate) to an opt-out (where I have to know to do it, go through the trouble of finding how to do it, and possibly even keep renewing my opt-out). Not as bad as a lot of reporting sounds, but also not a terribly constituent-friendly move.

A couple of links to the current targeted-marketing opt-outs for companies with whom I do business, so I bothered to waste a few hours figuring out how to opt out:

https://pc2.mypreferences.com/Charter/TargetedDigitalMarketingAds

https://www.t-mobile.com/company/privacy-resources/your-privacy-choices/ad-options.html

 

Selling Internet Usage

Thanks, Senate Republicans, for eliminating the regulation prohibiting ISPs from commoditizing individuals’ internet usage. The headlines I’ve seen for the past day (“your internet browsing history can now be sold”) fail to take into account how well targeted advertising works … and the real-world answer seems to be ‘it depends on how targeted’. Target, many years ago, ran some experimental ad campaigns based on data-mining predictions. A combination of purchases indicates that a woman is pregnant — and the reality of dealing with a baby means that whatever you start doing is apt to be the thing you do for a few years. Buy diapers at Target pre-birth, and Target has your diaper business for a few years. So identifying pregnant individuals and getting their initial business is a huge boon.

Except people don’t like it when you know something deeply personal about them — especially something they have not yet told their friends. Or when you find out about their kids — a man became quite irate when a “congrats, here are a bunch of coupons for baby things for you” flyer arrived for his daughter. His underage and certainly not pregnant daughter. The man expressed his anger to a manager at the local Target store. And had to ring the manager back later, after his daughter broke the news to her family. A company knew his daughter was pregnant before she braved telling her parents. I’m sure some adults found the flyers equally upsetting – how do they know?? I only told my husband and our parents … are they bugging our phones!?! Target ended up creating customized general flyers that happened to include a few coupons for baby stuff … alongside chain saws, motor oil, and grills. Effective, but it didn’t freak out potential customers.

My point being – companies don’t want to alienate potential customers. I search for a particular yarn online, and a few days later a flyer shows up for that yarn … I now have creepy negative feelings about the company sending the flyer.

Windows 10 Tablet Mode

Now I know the *right* answer is “don’t let your four year old randomly click stuff on your computer” … which is an extension of “don’t let your cat walk/sleep on your keyboard” (a maxim I could never convince my mom was a good rule for computer usage). But I booted my Windows 10 computer to find I no longer had a desktop. I’ve got some theme-colored background with a couple of icons. I can go to all apps. I can get into settings and all sorts of things.

Not a Windows desktop:

And I didn’t even know what this thing was called to Google how to get rid of it. A bunch of random clicking later, and I’ve discovered Windows 10 has a “tablet mode”. Which was turned on – and just like I could never figure out how a sleeping cat managed to hit the three-key command required to rotate an Intel graphics card display by 90º, I have no clue how Anya got into this particular mode. Luckily it’s easy to undo (click the tablet mode toggle again to turn it off); voila, I’ve got a desktop again.

LAPS For Local Computer Administrator Passwords

Overview

LAPS is Microsoft’s solution to a long-existing problem within a corporation using Windows computers: when you image computers, all of the local administrator passwords are the same. Now some organizations implemented a process to routinely change that password, but someone who is able to compromise the local administrator password on one box basically owns all of the other imaged workstations until the next password change.

Because your computer’s local administrator password is the same as everyone else’s, IT support cannot just give you a local password to access your box when it is malfunctioning. This means remote employees with incorrect system settings end up driving into an office just to allow an IT person to log into the box.

With LAPS, there is no longer one ring to rule them all – LAPS allows us to maintain unique local administrator passwords on domain member computers. A user can be provided their local administrator password without allowing access to all of the other domain-member PCs (and a compromised password on one box lets the attacker own only that box). A compromised box is still a problem, but access to other boxes within the domain would only be possible by retrieving other credentials stored on the device.

Considerations

Security: The end user is prevented from accessing the password or interacting with the process. The computer account manages the password, not the user (per section 4 of LAPS_TechnicalSpecification.docx from https://www.microsoft.com/en-us/download/details.aspx?id=46899).

Within the directory, read access is insufficient (per https://blogs.msdn.microsoft.com/laps/2015/06/01/laps-and-password-storage-in-clear-text-in-ad/) to view the attribute values. In my proposed deployment, users (even those who will be retrieving the password legitimately) will use a web interface, so a single service acct will have read access to the confidential ms-Mcs-AdmPwd attribute and write access to ms-Mcs-AdmPwdExpirationTime. There are already powershell scripts published to search an improperly secured directory and dump a list of computer names & local administrator passwords. You should run Find-AdmPwdExtendedRights -Identity <OU FQDN> to determine who has the ability to read the password values and avoid this really embarrassing oversight.

Should anyone have access to read the ms-Mcs-AdmPwd value beyond the service account? If the web interface goes down for some reason, is obtaining the local administrator password sufficiently important that, for example, help desk management should be able to see the password through the MS provided client? Depends on the use cases, but I’m guessing yes (if for no other reason than the top level AD admins will have access and will probably get rung up to find the password if the site goes down).

In the AD permissions, watch who has write permission to ms-Mcs-AdmPwdExpirationTime, as write access allows someone to bump out the expiry date for the local admin password. Are we paranoid enough to run a filter for expiry > GPO interval? Or does setting “Do not allow password expiration time longer than required by policy” to Enabled sufficiently mitigate the issue? To me, it does … but the answer really depends on how confidential the data on these computers happens to be.

With read access to ms-Mcs-AdmPwdExpirationTime, you can ascertain which computers are using LAPS to manage the local administrator password (a future value is set in the attribute) and which are not (a null or past value). Is that a significant enough security risk to worry about mitigating? An attacker may try to limit their attacks to computers that do not use LAPS to manage the local admin password. They can also ascertain how long the current password will be valid.

How do you gain access to the box if the local admin password stored in AD does not work (for whatever reason)? I don’t think you’re worse off than you would be today – someone might give you the local desktop password, someone might make you drive into the office … but it bears considering whether we’ve created a scenario where someone might have a bigger problem than under the current setup.

Does this interact at all with workplace-joined computers? My guess is no, but I haven’t found anything specific about how workplace-joined computers interact with corporate GPOs.

Server Side

Potential AD load – depends on expiry interval. Not huge, but non-zero.

Schema extension needs to be loaded. Remove extended rights on the attribute for everyone who has them. Add computer self rights. Add control access for the web service acct – and maybe some individuals too, as backup in case the web server is down?
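Roughly, that setup with the AdmPwd.PS module that ships with LAPS might look like the sketch below (OU and principal names are placeholders, not our actual structure):

Import-Module AdmPwd.PS
# extend the schema: adds ms-Mcs-AdmPwd and ms-Mcs-AdmPwdExpirationTime to computer objects
Update-AdmPwdADSchema
# let machines in the OU write their own password and expiration attributes
Set-AdmPwdComputerSelfPermission -OrgUnit "OU=Workstations,DC=example,DC=com"
# grant read and reset rights to the web service account (and a backup admin group)
Set-AdmPwdReadPasswordPermission -OrgUnit "OU=Workstations,DC=example,DC=com" -AllowedPrincipals "svc-laps-web","LAPS-Admins"
Set-AdmPwdResetPasswordPermission -OrgUnit "OU=Workstations,DC=example,DC=com" -AllowedPrincipals "svc-laps-web","LAPS-Admins"
# verify who can actually read the passwords in that OU
Find-AdmPwdExtendedRights -Identity "OU=Workstations,DC=example,DC=com"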

Would a report on almost-expired passwords that notifies someone have value?

Client Side

Someone else figures this out, not my deal-e-o. Set GPO for test machines, make sure value populates, test logon to machine with password from AD. Provide mechanism to force update of local admin password on specific machine (i.e. if I ring in and get the local admin password today, it should get changed to a new password in some short delta time).
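For that testing, the Microsoft-provided cmdlets cover the read / force-change cycle (again from AdmPwd.PS; the computer name is a placeholder):

Get-AdmPwdPassword -ComputerName "TESTPC01"     # retrieve the current managed password and its expiry
Reset-AdmPwdPassword -ComputerName "TESTPC01"   # expire it now; the machine sets a new password at its next GPO refresh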

Admin Interface

Web interface: provide a computer name & get the password. Log who made the request & for what computer name. If more than X requests are made per user in a (delta time), send an e-mail alert to the admin user just in case it is suspicious activity. If more than Y requests are made per user in a (longer delta time), send an e-mail alert to the admin user’s manager.

Additionally we need a function to clear the password expiry (force the machine to set a new password) to be used after local password is given to an end user.

User Interface

Can we map user to computer name and give the user a process to recover the password without calling the help desk? Or have the manager log in & be able to pull the local administrator password for their directs? Or some other way to go about actually reducing call volume.

Future Considerations

Excluding ms-Mcs-AdmPwd from replication to RODCs – there’s really no point to it being there.

Do we get this hooked up for acquired company domains too, or do they wait until they get in the WIN domain?

Does this facilitate new machine deployment to remote users? If you get a newly imaged machine & know its name, get the local admin password, log in, VPN in … can you do a run-as to get your creds cached? Or do a change user and still have the VPN session running so you can change to a domain user account?

LAPS For Servers: Should this be done on servers too? Web site could restrict who could view desktops v/s who could view servers … but it would save time/effort when someone leaves the group/company there too. Could even have non-TSG folks who would be able to get access to specific boxes – no idea if that’s something Michael would want, but same idea as the desktop side where now I wouldn’t give someone the password ‘cause it’s the password for thousands of other computers … may be people they wouldn’t want having local admin on any WIN box they maintain … but having local admin on the four boxes that run their app … maybe that’s a bonus. If it is deployed to servers, make sure they don’t put it on DCs (unless you want to use LAPS to manage the domain administrator password … which is an interesting consideration but has so many potential problems I don’t want to think about it right now especially since you’d have to find which DC updated the password most recently).

LAPS For VDI: Should this be done on VDI workstations? Even though it’s easier to set the password in the base VDI images than on each individual workstation, it’s still manual effort & provides an attack vector for all of the *other* VDI sessions. Persistent sessions are OK without any thought because they’re functionally no different than workstations. Non-persistent sessions with a new name each time are OK too – although I suspect you end up with a BUNCH of machine objects in AD that need to be cleaned up as new machine names come online. Maybe VDI sorts this … but the LAPS ‘stuff’ is functionally no different than bringing a whole bunch of new workstations online all the time.

Non-persistent sessions with same computer name … since the password update interval probably won’t have elapsed, the in-image password will be used. Can implement an on-boot script that clears AdmPwdExpirationTime to force change. Or a script to clear value on system shutdown (but that would need to handle non-clean shutdowns). That would require some testing.

 

Testing Process

We can have a full proof-of-concept test by loading the schema into the test Active Directory (verify no adverse impact is seen) and having a workstation joined to the test domain. We could provide a quick web site where you input a computer name & get back a password (basically lacking the security-related controls where # of requests generates some action). This would allow testing of the password on the local machine. It would also allow testing of force-updating the local admin password.

Once we determine that this is worth the effort, the web site would need to be fleshed out (DB created for audit tracking). Schema and rights would need to be set up in AD. Then it’s pretty much on the desktop / GPO side. I’d recommend setting the GPO for a small number of test workstations first … but that’s what they do for pretty much any GPO change, so not exactly groundbreaking.

Self Driving Cars (or Market Driven Algorithms)

I don’t see much of a future for self-driving passenger vehicles. There are two untenable options for crash avoidance algorithms. Either the algorithm prioritizes my life and property (which means it would kill someone else to save my life … good for me, bad for society) or it won’t (great for society, but am I going to pay money for a car that will literally kill me to save someone else?). Does the computer-assisted human driving model suffer this flaw? An algorithm that engages the brakes any time there is an obstacle within X feet fails to consider the vehicle that is about to slam into the side of your car if you don’t move it into the shrubbery ahead of you.

Self-driving unoccupied vehicles can simply de-prioritize themselves (and the owner needs to accept that financial risk). We may see driving as a service (DaaS?) where a real human is responsible for making these split-second decisions. But allowing people to achieve the metro experience in their own vehicle (i.e. you sit and work for half an hour whilst your conveyance delivers you to your destination) is probably not going to happen.