Digest: E-Myth Revisited

Digest The E-Myth Revisited Note: This is the second version of the book, per the “revisited” suffix. The Entrepreneur vs The Manager vs The Technician This concept is probably the most powerful takeaway I had from the book and is often referred to as “The struggle of the Technician”. I’ll describe this below. The author breaks down the positions in a business into three main roles, each of which has its own “boundary”:...

September 16, 2021 · 24 min · Greg Hilston

Digest: The Lean Startup

Digest The Lean Startup: How Today’s Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses I recently finished The Lean Startup by Eric Ries and, in an attempt to reinforce what I learned and to share it with the world, I figured I’d document the core concepts I took away, as well as sharing all the highlights I made. The Build-Measure-Learn Loop One of the first concepts Eric goes over is what he calls the “Build-Measure-Learn” loop, and it’s one of the most prominent things that startups can do faster than large companies....

August 11, 2021 · 18 min · Greg Hilston

Digest: The Power Of Full Engagement

Digest of the book The Power of Full Engagement The following notes/excerpts are from the book “The Power of Full Engagement: Managing Energy, Not Time, Is the Key to High Performance and Personal Renewal” by Tony Schwartz. It can be purchased here. I’ve been reading through this book and using each chapter as content for leading discussions. I’ll continue to record my favorite quotes from each chapter and provide a link to a small presentation in which each quote exists on its own slide....

December 1, 2020 · 1 min · Greg Hilston

Digest: Statistics

After reading Statistics, 4th Edition, I found myself with plenty of highlights from the book. As an exercise to help commit my highlights to memory, I’ll be documenting them here. This post is pretty much just for myself, but I figured I’d share it online on the off chance it helps someone else out. Chapter 1: Controlled Experiments “treatment group”: individuals given the drug that’s being tested “control”: individuals that are not treated “double blind”: neither the subjects nor the doctors who measure the responses should know who was in the treatment or control group the treatment and the control group should be as similar as possible “randomized control”: an experiment where an impartial chance procedure is used to assign subjects to treatment or control groups “placebo”: having the subject believe they received real treatment, when in fact they’re receiving nothing using a randomized double blind design reduces bias to a minimum badly designed studies may exaggerate the value of risky surgery using randomized control ensures the control group is like the treatment group this way comparisons are only made among patients who could have received the therapy “adherers”: the individuals who took their drugs as prescribed Chapter 2: Observational Studies “summarize the Pellagra disease story”: “originally it was thought the disease was carried by flies, as fly-infested homes were more prone to the disease; it turned out it was diet based and the poorer homes had this poorer diet” association !...

November 25, 2020 · 8 min · Greg Hilston

Deep Learning with PyTorch: Optimizers

This is based on code from the following book. The following blog post walks through what PyTorch’s Optimizers are. Link to Jupyter Notebook that this blog post was made from PyTorch comes with a module of optimizers. We can replace our vanilla gradient descent with many different ones without modifying a lot of code. %matplotlib inline import numpy as np import pandas as pd import seaborn as sns from matplotlib import pyplot import torch torch....
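
As a rough illustration of the swap the excerpt describes, here is a minimal sketch (not the notebook’s code) that fits the book’s thermometer-style linear model with `torch.optim.SGD`; changing only the optimizer constructor, e.g. to `torch.optim.Adam`, leaves the training loop untouched. The data values are a subset of the readings quoted in the Autograd excerpt below.

```python
import torch

# Subset of the thermometer readings quoted in the Autograd excerpt below
t_u = torch.tensor([35.7, 55.9, 58.2, 81.9, 56.3])   # unknown units
t_c = torch.tensor([0.5, 14.0, 15.0, 28.0, 11.0])    # Celsius
t_un = 0.1 * t_u                                     # crude scaling so one learning rate suits both params

params = torch.tensor([1.0, 0.0], requires_grad=True)  # [w, b]
optimizer = torch.optim.SGD([params], lr=1e-2)          # swap for torch.optim.Adam([params], lr=1e-1); nothing else changes

for epoch in range(5000):
    optimizer.zero_grad()                      # clear gradients from the previous step
    w, b = params
    loss = ((w * t_un + b - t_c) ** 2).mean()  # mean squared error of the linear model
    loss.backward()                            # autograd fills params.grad
    optimizer.step()                           # optimizer applies its update rule

print(params)
```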

October 29, 2020 · 49 min · Greg Hilston

Deep Learning with PyTorch: Autograd

This is based on code from the following book. The following blog post walks through what PyTorch’s Autograd is. Link to Jupyter Notebook that this blog post was made from %matplotlib inline import numpy as np import torch torch.set_printoptions(edgeitems=2) Taking our input from the previous notebook and applying our scaling t_c = torch.tensor([0.5, 14.0, 15.0, 28.0, 11.0, 8.0, 3.0, -4.0, 6.0, 13.0, 21.0]) t_u = torch.tensor([35.7, 55.9, 58.2, 81.9, 56.3, 48....
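
To make the excerpt concrete, here is a minimal, self-contained sketch (assumed toy values, not the notebook’s full example) of what autograd does: mark tensors with `requires_grad=True`, build a loss from them, call `backward()`, and read the gradients off `.grad`.

```python
import torch

# Parameters we want gradients for
w = torch.tensor(1.0, requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)

# Toy data (assumed values for illustration)
x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([2.0, 4.0, 6.0])

loss = ((w * x + b - y) ** 2).mean()  # forward pass builds the computation graph
loss.backward()                       # autograd traverses the graph and accumulates gradients

print(w.grad, b.grad)  # d(loss)/dw and d(loss)/db, no hand-derived derivatives needed
```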

October 28, 2020 · 3 min · Greg Hilston

Deep Learning with PyTorch: Parameter Estimation

This is based on code from the following book. The following blog post walks through what Parameter Estimation is. The goal here is to explain the theory. I have rewritten this notebook from the above book’s PyTorch Tensor implementation to be in pure NumPy. Link to Jupyter Notebook that this blog post was made from The story here is we’ll learn about Parameter Estimation by pretending we have two thermometers on our desk....
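
Since this post rewrites the book’s example in pure NumPy, here is a minimal sketch of the idea under the same assumptions: a linear model t_c ≈ w * t_u + b fit by hand-derived gradient descent. The data is the subset of readings quoted in the Autograd excerpt above; the notebook has the full set.

```python
import numpy as np

# Subset of the two-thermometer readings quoted above
t_u = np.array([35.7, 55.9, 58.2, 81.9, 56.3])  # unknown units
t_c = np.array([0.5, 14.0, 15.0, 28.0, 11.0])   # Celsius
t_un = 0.1 * t_u                                # rough normalization

w, b = 1.0, 0.0
lr = 1e-2

for epoch in range(5000):
    t_p = w * t_un + b                           # predicted Celsius
    grad_w = 2.0 * ((t_p - t_c) * t_un).mean()   # d(MSE)/dw, derived by hand
    grad_b = 2.0 * (t_p - t_c).mean()            # d(MSE)/db
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b, ((w * t_un + b - t_c) ** 2).mean())  # fitted params and final loss
```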

October 27, 2020 · 8 min · Greg Hilston

Digest: The Clean Coder Book

I recently finished reading the book The Clean Coder, by Uncle Bob, and decided to publicly share my notes. To be honest, I wasn’t stoked about this book, as I was told it was not technical. I was pretty skeptical that I’d take anything away and was dreading reading it. I couldn’t have been more wrong. Not only was the book very pleasant to read, but it was also chock full of great advice that I believe I’ll remember for the entirety of my career....

June 24, 2019 · 21 min · Greg Hilston