Covid-19 – why are we so vigilant when it seems like nobody has it?

So, there’s this other thing going on – the protests. And restaurants are open. And it feels like there is social pressure .. don’t wear a mask, you look stupid. Maybe just me. Feels like, “If this were a real problem, more people would have it.”

Human feelings are nice, but.. they don’t scale well.

What's the right percentage of people who should be sick in order for it to "feel" right? Let's say.. 10%.

At 4.4 Million people in KY.. wait, no, that's assuming even spread. Let's just go with, "In the metropolitan area that I live in, 10%".

At 600,000 people, that means 60,000 people would be exhibiting symptoms.

According to the most recent numbers (link), 6700 active cases, 500 hospitalizations, 70 in the ICU. So about 7% hospitalized, and 1% in the ICU. So if 60,000 folks were "active", that'd be roughly 4,500 hospitalizations and over 600 in the ICU. We have (link) on the order of 2500 hospital beds across the state of KY and 1500-ish ICU rooms.. but that's for 4.4 MILLION people. I can't find the exact numbers for Louisville, but basically:

If “enough” people around us seem sick, then the hospital system would be overwhelmed and we’d be seeing the 10%+ death rates that were seen in other places that got overwhelmed.
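A back-of-the-envelope sketch of that arithmetic, using only the numbers from the paragraphs above (the post's rounding is a bit looser than the exact math):

```javascript
// Project hospital load if 10% of the metro area were actively sick,
// using the hospitalization/ICU ratios from the reported numbers.
const metroPop = 600000;
const active = metroPop * 0.10;              // 60,000 symptomatic

const reportedActive = 6700;
const reportedHosp = 500;                    // ~7% of active cases
const reportedIcu = 70;                      // ~1% of active cases

const projectedHosp = Math.round(active * reportedHosp / reportedActive);
const projectedIcu  = Math.round(active * reportedIcu / reportedActive);

const stateBeds = 2500;                      // statewide, per the post
console.log(projectedHosp, projectedIcu, projectedHosp > stateBeds);
```

Roughly 4,500 hospitalizations against ~2,500 beds statewide: overwhelmed.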

So, if we win, it looks like: Nobody around us, nobody we know, gets the virus. And still a ton of people die.

I guess I'll continue to wear my mask and try not to be a vector, even if 99.9% of the time I'm not near anybody with the virus, and I'm probably not carrying it either.

Crystal Reports Fun

I haven't posted, or talked much, about my new work .. basically, I joined a company whose primary job is things, and most work focuses around an ERP, and it's all about stabilizing and optimizing flows of information around processes. Edit/Update: I wrote this in, like, February. A lot has happened since then.. CoViD19 especially. It is now June.

One of my new skillsets is Crystal Reports. To which my developer peeps usually say Ewww. Eh, it works. The main thing is, the ERP system that everybody is logged into has a way to embed Crystal Reports into it. That ERP system (Prophet 21) is also backed by a SQL Server database, with a decent table structure and a decent user-defined-field add-on strategy, making many tasks.. sane.

However, navigating the environment for what actually works has been hard. There was not a lot of prior art doing things at the architectural level I would like, so I've had to fumble around. Here's what does NOT work:

  • Using anything other than a particular driver (I forget which one) for connecting to SQL. Specifically, do not use the SQL Native driver. This is because when the ERP is hosting the report, it does a switcharoo on the connection to connect to the right "instance" of the ERP, and if you use the wrong driver, that doesn't work. You don't find out till you try to run the report from the ERP.
  • Directly using a stored procedure to do heavy lifting for a report. You can, and it auto-creates report parameters for you, but those report parameter types lead to less-than-optimal user dialog (think dates plus times instead of dates).
  • Using a parent crystal report to include a subreport to get around the previous thing. Works great for a crosstab, but page headers don't work so well in a grid-style report. However, I am able to bind a parameter from the parent report through a calculated field to plug in to the subreport (and thus the stored procedure).
  • Also, if you have a parent report that only calls sub-reports, and doesn’t actually connect to the database itself, the ERP system doesn’t like that because it cannot find the database connection to override.
  • Not choosing a printer when designing the report. Apparently this affects font choices and Arial looks better than Device CPI 10.

Here’s what does seem to work:

  • I can use Views to encapsulate business logic, for example "The View of xxx customers", where xxx is a particular program that customers can enroll in.
  • I can use stored procedures to stay D.R.Y. (Don't Repeat Yourself) – for example, the stuff to get the raw data of the number of designer frames sold per customer within a time period.
  • I can call stored procedures from a “command” custom SQL block from Crystal Reports. In that block, I can: declare @t table (…) and insert into @t exec SPROC {?param1} {?param2} to get data from a stored procedure.
    • For example, there are two reports: One is a CrossTab that breaks out customers across brands and # of frames, and another is a detail report. The detail report goes into how many DIFFERENT brands were sold + the number of frames sold — these numbers roll into a formula describing what % back the customer gets, per the rules of the contract(s). Both of these reports use the same stored procedure to get the underlying data.
    • However, using this method, I have not yet been able to go from (user input) to (calculated field) to (input into command) to (call stored procedure). On the other hand, I can do a lot of manipulation in T-SQL, so that should be fine.
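For reference, the command-block pattern above looks roughly like this (the sproc name, columns, and parameters here are made up for illustration; the real ones live in the P21 database):

```sql
-- Inside a Crystal Reports "Add Command" SQL block.
-- {?StartDate} and {?EndDate} are Crystal report parameters.
declare @t table (
    customer_id  varchar(20),
    brand        varchar(50),
    frames_sold  int
);

insert into @t
exec dbo.usp_FramesSoldByCustomer {?StartDate}, {?EndDate};

select customer_id, brand, frames_sold
from @t;
```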

I’m continuing to learn a lot of things.. next week looks like it will be learning the ways of our EDI interfaces with some bigger customers, like the names you’ll find at a Mall. (I don’t know how much I can talk or not about our customers). (Edit: It is preferred that I do not.)

Side note – We have added datadog to our infrastructure, and monitoring is making our lives better, I think. Separate blog post, but in summary: immediate notify on errors, and notify on lack of success. Except on weekends.

Looking back at this post I wrote 4 months ago.. wow, there's so much detail I could go into about all the little things that I've learned and tweaked. Like some powershell to inspect bad emails in an SMTP dead letter folder. And a powershell to automate connecting my Cisco VPN connection. Messing with RedHat 8.1, re-learning all the unix things, including SMB shares. However.. that's another post, coming up shortly.

Using Timesheet data from Harvest to create a Force Directed Graph Animation

This is too complicated to try to put into words, so there’s a screencast instead.

Video of the final product:

https://www.youtube.com/watch?v=_TJi9yAm_kM

Video explaining how to do it:

https://youtu.be/DrrA9sd6AAQ

Source: https://github.com/sunnywiz/HarvestToGephi/blob/master/HarvestToGephi/Program.cs

In text form: C# code to convert a Harvest CSV extract into node and edge CSV files. Then in Gephi, import the two files, convert the start/end dates into an interval, and set up the prettiness. Record a long video with lots of stabilization and then speed it up.
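The real converter is the C# program linked above; as a rough sketch of the shape of that transform (field names invented for illustration), the node/edge generation looks something like:

```javascript
// Turn timesheet rows into Gephi node/edge CSVs: people and projects become
// nodes, and each (person, project) pairing becomes an edge whose interval
// spans the dates that pairing appears in the timesheet.
function toGephi(rows) {
  const nodes = new Map();   // id -> {id, label}
  const edges = new Map();   // "person->project" -> {source, target, start, end}
  for (const r of rows) {
    nodes.set('p:' + r.person, { id: 'p:' + r.person, label: r.person });
    nodes.set('j:' + r.project, { id: 'j:' + r.project, label: r.project });
    const key = r.person + '->' + r.project;
    const e = edges.get(key);
    if (!e) {
      edges.set(key, { source: 'p:' + r.person, target: 'j:' + r.project,
                       start: r.date, end: r.date });
    } else {
      if (r.date < e.start) e.start = r.date;
      if (r.date > e.end) e.end = r.date;
    }
  }
  const nodeCsv = ['Id,Label',
    ...[...nodes.values()].map(n => `${n.id},${n.label}`)].join('\n');
  const edgeCsv = ['Source,Target,Start,End',
    ...[...edges.values()].map(e => `${e.source},${e.target},${e.start},${e.end}`)].join('\n');
  return { nodeCsv, edgeCsv };
}

const sample = [
  { person: 'Sunny', project: 'Alpha', date: '2019-01-05' },
  { person: 'Sunny', project: 'Alpha', date: '2019-03-01' },
  { person: 'Kate',  project: 'Alpha', date: '2019-02-10' },
];
const out = toGephi(sample);
console.log(out.nodeCsv);
console.log(out.edgeCsv);
```

Gephi then imports the two CSVs and the Start/End columns become the time interval used for the animation.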

Sorting Watch Later by Video Length

EDIT – this does not work anymore; see comments.

Something I do often – I sort my Watch Later list on YouTube by how long the videos are. I think it's "shortest video to get the quickest entertainment bang for my time" (which is semi sad).

I wanted to automate it. Found several dead ends: YouTube doesn't let you use their Apps Script API to modify the Watch Later list, and the page doesn't let you run a bookmarklet to load an external js file.

However, Chrome allows for snippets.. and some folks pointed out that I could use various other software (Greasemonkey? another name was given) for running arbitrary javascript on a page.

Ctrl-Shift-I, Sources, >>, Snippets

https://gist.github.com/sunnywiz/064a4d07f8469074356e73eba5dc1215

That's my solution for now. It gets what I need done. I wish I could have done it more generically.
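The gist above has the actual (now-broken) snippet; the part that doesn't depend on YouTube's ever-changing DOM is just parsing the duration badge on each entry and sorting. A sketch of that core, with the page-scraping left out:

```javascript
// Parse a YouTube duration badge like "12:34" or "1:02:33" into seconds.
function durationToSeconds(text) {
  return text.trim().split(':')
    .reduce((total, part) => total * 60 + parseInt(part, 10), 0);
}

// Given [{title, duration}] entries scraped from the page, order shortest-first.
function sortByLength(entries) {
  return [...entries].sort(
    (a, b) => durationToSeconds(a.duration) - durationToSeconds(b.duration));
}

const sorted = sortByLength([
  { title: 'long talk', duration: '1:02:33' },
  { title: 'quick tip', duration: '3:05' },
  { title: 'medium',    duration: '12:34' },
]);
console.log(sorted.map(e => e.title)); // shortest to longest
```

The fragile half — finding the entries and issuing the drag/"move to top" actions — is what broke, per the edit above.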

Thanks to Matt H. for pointing me down the path of Chrome snippets.  That dude is bright.

STAQS

As part of EvolveKY.org, I got to do a table and speak at STAQS – Southern Transportation Air Quality Summit – here in Louisville.  Learned a lot of things:

  • MOBILE, MOVES – EPA models used in calculating air pollution.  I think these are permissible to use instead of sampling when reporting or calculating impacts of a project.
  • CMAQ – Congestion Mitigation and Air Quality – it's a way of funding things
  • MPO – Metropolitan Planning Organization.. can span multiple states and partial counties; examples: Kentuckiana, Cincinnati, St Louis
  • VW Settlement – learned a lot more about it, the kinds of things that States can fund or not fund.

Leaving this note to myself so I can remember this fun learning time.

Tree of Directions Revisited

image

We are in the process of possibly changing residences, so (during my coding fun times on Wednesday mornings with a friend) I revisited this project.  When I left it, it looked like this:

image

In fact, that was so not-cute that I had tried to go 2D with it instead.   

I revisited it, and went through several stages. 

First, I had to re-do how I called Google Maps – they no longer had a usable "free" tier, but they now have a "pay as you go" tier which equates to $5 per 1000 route requests.  No problem.  But, to keep costs low, I added a cache strategy so I didn't ask for the same thing twice.  (Caveat: I think I forgot to turn off traffic, so the same request at different times could come back delayed by different amounts).
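The cache strategy was essentially memoization keyed on the request. A minimal sketch (the getRoute callback stands in for the real Google Maps call):

```javascript
// Memoize route lookups so the same origin/destination pair is only ever
// requested (and billed) once; getRoute is whatever actually hits the API.
function cachedRoutes(getRoute) {
  const cache = new Map();
  return (origin, destination) => {
    const key = origin + '|' + destination;
    if (!cache.has(key)) {
      cache.set(key, getRoute(origin, destination));
    }
    return cache.get(key);
  };
}

// Fake fetcher that counts how many real requests would have been billed.
let calls = 0;
const lookup = cachedRoutes((o, d) => { calls++; return `route:${o}->${d}`; });
lookup('home', 'work');
lookup('home', 'work');   // served from cache, no second request
lookup('home', 'school');
console.log(calls); // 2
```

A real version would also want to persist the cache to disk between runs, so re-running the program doesn't re-bill.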

Then, I tried to create a surface underneath the plot.  I did this using some electrical engineering stuff I once helped somebody with – here's the plot + a surface underneath it.  Better, but not the awesome I was hoping for:

image

I called that the "Minecraft" look.  (That's a very rough draft with only 10×10; it's about the same at 50×50, except the squares are smaller.)  Then I went for something where I calculated the midpoint and drew a polyhedron from each of the four vertices; it looked a lot better:

image

(The previous were GitHub previews of STL files; this is the 3D Model Viewer built in to Windows.)

But, it's still too .. jaggedy.  So, I did a few things: first, rather than a spiderweb, I did a "ramp" effect (filling in underneath the path); as well, I trimmed off all the residential streets (<30mph) at the ends of the routes.  This gave me a much better print, which is closer to what I had in mind when I started: something that showed "which were the best ways to get places":

image

However, I didn’t realize it, but I had done something even better.  Here’s Just the Ramps:

image

This is what I had been going for!  To heck with the surface print part!  You can see the mountain AND the detail!  This is commit a75a67 at https://github.com/sunnywiz/TreeOfDirection/.

Next direction — where I do the same thing for both addresses, and then figure out a way to do stuff in two colors (i.e., two prints that join together).  In order to do this, I have to first force the bounding boxes to be exactly the same.

Also other things learned – 3D slicing has come a long way, I don’t need things to be perfectly combined as long as they don’t have holes in them.

Pruning Promotions in GMail

I finally got around to writing some code – turns out it's super easy. There's something called script.google.com; you go to Resources | Advanced Google Services and enable the Gmail service, and then you can write functions to prune your inbox arbitrarily:

function prunePromotions() {
  var response = Gmail.Users.Messages.list('me', {'q':'label:promotions', 'includeSpamTrash':false});
  if (response && response.messages && response.messages.length > 0) {
    // messages seem to come in from most recent to least recent
    Logger.log('inspecting ' + response.messages.length + ' messages');
    var collected = {}; // From-address -> full messages, most recent first
    for (var i = 0; i < response.messages.length; i++) {
      var stub = response.messages[i]; // list() only gives ids, not content
      if (!stub.id) continue;
      var message = Gmail.Users.Messages.get('me', stub.id);
      if (message && message.payload && message.payload.headers) {
        for (var j = 0; j < message.payload.headers.length; j++) {
          var header = message.payload.headers[j];
          if (header.name == 'From') {
            var from = header.value;
            if (!collected[from]) collected[from] = [];
            collected[from].push(message);
            break;
          } // if header is From
        } // for each header
      } // if message.payload.headers
    } // every message
    // keep the most recent message from each sender; trash the rest
    for (var from in collected) {
      for (var k = 1; k < collected[from].length; k++) {
        var dup = collected[from][k];
        Logger.log('trashing ' + dup.snippet);
        Gmail.Users.Messages.trash('me', dup.id);
      }
    }
  }
}

Unfortunately, I get a lot of promotions, and this only works on the most recent 100 messages or so, so it might be of limited use in this state.  But it's a start.
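That ~100-message cap is just the default page size of the list call; it also returns a nextPageToken that can be fed back in to get the next page. A sketch of the paging loop, written against an injected list function so it can be shown standalone (in Apps Script you'd pass a wrapper around Gmail.Users.Messages.list):

```javascript
// Walk every page of results. listPage(pageToken) should return
// { messages: [...], nextPageToken: '...' } — the same shape as
// Gmail.Users.Messages.list('me', {q: ..., pageToken: ...}).
function listAllMessages(listPage) {
  const all = [];
  let pageToken = undefined;
  do {
    const response = listPage(pageToken);
    if (response && response.messages) {
      all.push.apply(all, response.messages);
    }
    pageToken = response && response.nextPageToken;
  } while (pageToken);
  return all;
}

// Fake two-page response to show the loop terminating.
const pages = {
  undefined: { messages: [{ id: 'a' }, { id: 'b' }], nextPageToken: 'p2' },
  p2:        { messages: [{ id: 'c' }] },
};
console.log(listAllMessages(t => pages[t]).length); // 3
```

(Beware quotas if the Promotions backlog is huge — each message still costs a get() and possibly a trash() call.)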

Learned some IoT today

My head is swimming with new information, so dumping it here in a bit, but first, a meta- of what’s going on.

1.  I often feel "left behind" and "not having time to play with things".  I decided to try to make it a priority by committing with a friend to spend an hour learning new stuff.. together.  So the idea is we spend half an hour on my project, and half an hour on his project, in a pair-programming kind of way.

2. We did the first non-planning session of it today.  Well, it was still a lot of planning.  And what happened is:  the 30 minutes were just enough to get me unblocked and to learn which directions to go.. and then we switched over to his project.

3. I’m looking forward to revisiting this next week.

3.97.  If this works well over time, we might polish it up a bit and open it up to more folks.

The Details. 

First, His Project:  (topic not revealed, that’s his story)

  • I saw Jupyter for the first time in action.  It was amazing.  I think I'll use it the next time I need to go spelunk and chart in an RDBMS (instead of query, copy-to-excel, create chart)
  • Revisiting OpenCV for image manipulation and stuff
  • There might be some ML in that project’s future.  That’s up to him, of course.  My job is to provide interest and provide support and be a rubber duck.
  • I hope to learn more about running things in docker containers from him.

image

My Project – Temperature logging for my A/C, Furnace — what I learned

  • I got an LED to blink and read a photoresistor via a WIFI connection (via particle.io and Tinker)
  • Kinda-sorta why we need resistors going to ground to make circuits work.  At least, I saw that repeated pattern.
  • What a breadboard is, and how it is wired
  • Digital vs Analog ~= 3.3V and ranges, and HI=3.3V.  Some specific pins have better A-to-D converters and can read values up to 4095 (12-bit).
  • The temperature sensor I have uses something called a One-Wire protocol, and I have to do a library-add to get the right library to read it.  It involves scanning for devices; multiple devices can be on the same wire.  However, there are two ways of powering it – parasitic and non-parasitic.
  • It might be that when I plug power into the Photon, if the other end is my computer, I might be able to debug-print via a serial connection.  That will help with the one-wire scan.
  • Turns out particle.io does all kinds of cloudy-mc-cloud stuff, so once I have my number, all I need to do is publish("temperature", value) and it heads up to the cloud.  Then if I add the integration to Azure IoT, it goes into Azure where I can do other things with it.  AWS doesn't seem to have that integration.  There are other folks who do have ways to graph things.  I need to analyze patterns over a 6-month time period, involving the drop or rise in temperature from ambient to heated/cooled.
  • For one-wire stuff, I need a 4.7k ohm resistor, and I'll probably need breadboard cables.  I didn't have them yet.  I ordered them.  Should be coming along with a few more thermistors.
  • Using one-wire, I might be able to have a separate sensor for "ambient temperature" so that I can diff the two more easily and just take the delta.
  • I read up on the spec for the “Wiring” language, which is basically C/C++ but scaled down. 
  • I learned there's a web, a desktop, a CLI, as well as a VSCode development environment.  The VSCode one seems to have a debugger.  Don't think I'll need that, as long as I can get Serial.print() to work.

Side note – I didn’t have LetsEncrypt set up to auto-renew my cert, so the https cert for this blog expired.  I had to go find my blog post to find the link to find the commands to renew it.   Done.

Thoughts on Driving a Tesla Model 3

image

I finally did it!  We were in Las Vegas, and my wife said yes, so I rented this Tesla Model 3 for a day to make a quick trip to the Hoover Dam and Lake Mead.

I’ve been dreaming about this car for a while.  I had done my research.  So without trying to classify / rate it overall, here’s the individual things I found:

Acceleration

I started out the day in Chill Mode, graduated about halfway through the day to normal acceleration. I only let it loose once, going up an interstate ramp.  About 2x or better the G's my Nissan Leaf gives me, and way past 45.  Wow.  Pedal about halfway down = what I'm used to as "strong" acceleration.  Probably not something I'd use a lot.

Autopilot was amazing

Turn it on, and it's like the lane I'm in suddenly developed invisible steel rails and the car started gliding on those tracks.

  • Turns out I drive more to the right than I thought.  I kept trying to correct it.  Wife thought the car was doing a better job than me. 
  • Trying to force the car out of where it wants to go is about the same as.. guiding a car up onto a curb at a very slight angle?  Didn't measure it, maybe a pound of force?  Easily done, but more than just a touch of the steering wheel.  When it gives, it's almost a .. snap? But the car doesn't veer.
  • The amount of agitation to give the steering wheel so that it knows you’re still there is more than what I normally apply to a steering wheel when I’m driving long distance – I have a very light touch.   It got pissy with me.  So yes, it ensures you are paying attention.
  • It tended to drift a bit when lanes merge in to your lane.
  • It tended to be confused and then snap to a lane when exiting and going down ramps.
  • Telling it to change lanes is awesome.  It's very sure of itself.
  • It tended to be very conservative when people pulled out in front of me in the city.
  • I had way more situational awareness about surrounding cars.  It's like: see them in the side mirror, see them in the display, see them pass me out the driver window, and the whole time, the car knows they are there.
  • I kept it at a follow distance of 5, which is 2.5 seconds.   And pretty much the speed limit. 
  • Even in the city, when not on autopilot, I’d have it in “smart cruise control” where it followed the car in front.  Very relaxing.  It could track the forward car even on windy roads.
  • There were some confusing times – after stopping while exiting a ramp, not realizing it still wanted to drive, perhaps?  Or, the car being a little paranoid, applying braking of its own, and I'm surprised, so I apply more brakes as well.
  • It's already good enough to be a game-changer, and it's going to get better.  I want in on this action.  Everybody else, please catch up soon.

Brake, and Hold

Car slows down; press brake to come to a stop.  Remove feet from pedals.  Car stays there.  Like "Hold" mode in a Prius going up a hill.  Very nice.  Very much a fan.

Single Pedal Driving

I was worried that there wouldn’t be a lot of “Neutral” coast.. but .. I didn’t miss it. I spent more time in auto-cruise-control adjusting speed with the thumbwheel.. and even when controlling my own speed, there’s a very solid feel of “keep this speed going on.”   

I did find that I'd lift my foot a second or two before I intended to slow down – starting to move my foot to the brake – and the car would slow down, startling me.  Probably a few days would get rid of that habit.  It's a different brain circuit.

I'd say the first 5-10 minutes are "WTF", and then it's "oh this is nice" + "whoa, why did I do that", and maybe in 2-3 days it would be second nature.

Steering

It felt like I was holding a heavy steering wheel attached with super-strong titanium links to the wheels.  It felt like I could "feel" the wheels through the steering wheel.  Any little input I gave, immediate response.  Is this called "tight"?  And it wasn't even in Sport mode.  It was amazing.

Drive, Park, and Walk Away.  There is no Off

This was very freaky.  Yeah, stop the car, put it in Park, get OUT of the car, and lock it.  The car is now "Off".  Unlock the car.  The car is now "On".  There is a way to turn the car off while you're in the car, but it's not a common use case.

Range Anxiety

What range anxiety?  Yeah, I don't even know how far we drove; I'd have to look it up.  No worries about the A/C being on, either.  We got it with 228 miles available, drove it down to.. 168? Charged it up to 260.

Supercharging

Easily found.  Easily plugged in.  And then … it starts to ramp up the amps.. and.. OMG, 300 miles in an hour!  (initially). I think we stuck an additional 90 miles on in about 40 minutes; 20 of those minutes were wandering around a convenience store.  It is much more relaxed than getting gas, because you know you're going to be there a bit.  Which is how I'd prefer to roll – relaxed.  Definitely getting the right snack.  I look forward to taking a long-distance trip in a Tesla with Autopilot.

Voice Navigation and Navigation in General

Better than any other car I've been in; on par with what I expect from Google Maps and Apple Maps.  Probably closer to Apple Maps in terms of accuracy – it missed the start of one offramp entirely, chose the less optimal way into a parking lot, and took the long way around involving a roundabout when a simple left was available.

The Car Itself

I think I'd rather get a Model Y.  The Model 3 was too close to the ground.  And almost too cute.  Probably go with gray or black; red is just not me.  Wife says blue.

Using Turo

I think I'd recommend trying it out.  Granted, this host was maybe better than most?  He had an assistant who took care of all the details and made everything super simple.  Going from Avis to Turo feels about the same as going from taxis to Lyft/Uber.  The same corollary applies – it's a one-to-one, pre-meditated thing with more details to figure out, vs "go to one place and just get a car" (equivalent to going to a hotel taxi line and getting a taxi).  While renting, I had insurance cards provided by Liberty Mutual on my phone.  There's a checkout and check-in process that involves documenting the state of the vehicle.

At the end of the day, I was sad we didn’t have more errands to run.  

Nissan Leaf: Estimating Range

I was driving myself crazy. I’d keep trying to guess how my car was doing. Did an experiment to put my brain to rest:

  • X-Axis is Battery% – that's why it's right-to-left.
  • Y-Axis is either Trip-Miles or Estimated Miles, same units
  • This is a 2015 Leaf, 11 Capacity Bars. I don’t have LeafSPY available at the moment (someone borrowed the dongle)
  • Eco Mode, temp 48 degrees (or so), dry.
  • Using Cruise Control at speed limit whenever possible.
  • The “Lap” was a 12.4 mile loop with TJ Unitarian at one end. Relatively flat; easy traffic; and fast (for me) charger to get me to 100% to start.
  • Laps 1 and 2 were without heater or lights; as was Lap 4.
  • Lap 3 – I turned on headlights, seat heater, and steering wheel heater. (but not the main heat)
  • Between Lap 3 and 4 – car sat for 2 hours as I attended the EvolveKY MeetUp.
  • Lap 4 done identical to laps 1 and 2 to verify the pattern
  • Lap 5 – I turned on the heat (and headlights). It was 48 outside; inside set for 70, so only a 20 degree jump. Note that inside of car had been heated by sunlight already, not a cold morning start.
  • Lap 6 – I got the LBW (Low Battery Warning) just as I started it. Different route for the second part – I headed home.

Observations:

  • The car is optimistic for the first couple of miles. (green)
    • It would have gotten 84 miles, maybe, if it had stayed like that.
  • Eyeballing battery use and odometer leads to a pretty solid line – better estimation than the Guess-O-Meter which jumps ALL over the place.
  • I expected the heater to shunt the line over, but I did NOT expect it to "recover" (pink).
  • Looks like when LBW kicks in, a different estimation starts to be used that is more pessimistic. (yellow)
  • Then it stops giving numbers.
  • I got home at 74.6 miles, but later without charger drove around the block a few times, up to about 76 miles. Never got to Turtle. I suspect (based on age + capacity bars) my battery is mis-calibrated, and there’s more than it thinks there is.
  • The first 3 laps all showed 4.5 miles per kWh. Using that number puts me at 16 to 17 kWh of available battery for sure.
  • Using Seat Heaters and Steering Wheel Heaters .. could NOT tell the difference in the range. Yay!
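The capacity estimate is just miles driven divided by efficiency:

```javascript
// Back out usable battery capacity from the observed efficiency.
const milesPerKwh = 4.5;     // shown for the first 3 laps
const milesAtHome = 74.6;    // odometer on getting home
const milesTotal  = 76;      // after circling the block a few times

const kwhLow  = milesAtHome / milesPerKwh;  // ~16.6 kWh
const kwhHigh = milesTotal  / milesPerKwh;  // ~16.9 kWh
console.log(kwhLow.toFixed(1) + ' to ' + kwhHigh.toFixed(1) + ' kWh');
```

And that's a floor — the car never hit Turtle, so there was still some battery left past 76 miles.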

I had been hoping that it would be colder.. my "range calculation" in my head (looking at the odometer at 90%, 80%, 70%, etc) was giving me a 50-mile range or so. However, it was a nicer day.. or the car knew it needed to perform.

If anybody wants to watch the boring data videos, they are https://youtu.be/Cjv03-F8TqI and https://youtu.be/nfHJhwyg4xs (assuming that the upload succeeds). Captured with a Hero6 in Timelapse Video mode.