Ask HN: Concepts that clicked only years after you first encountered them?

source link: https://news.ycombinator.com/item?id=34206219
28 points by luuuzeta 1 hour ago | hide | past | favorite | 37 comments
I'm reading Petzold's Code [1], and it dawned on me that I didn't understand logic gates intuitively until now. I took a Computer Architecture course back in college, and I understood what logic gates meant in boolean algebra but not empirically. Petzold clarified this for me by going from the empirical to the theoretical using a lightbulb, a battery, wires, and relays (which he introduces when he talks about the telegraph as a way to amplify a signal).

Another concept is the relationship between current, voltage, and resistance. For example, I always failed to understand why longer wires mean more resistance while thicker wires mean less resistance.

[1]: https://www.codehiddenlanguage.com/
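That relationship falls out of R = ρL/A (resistivity times length over cross-sectional area). A quick sketch, using copper's resistivity but otherwise made-up numbers:

```python
import math

def wire_resistance(length_m, diameter_m, resistivity=1.68e-8):
    # R = rho * L / A: doubling length doubles R;
    # doubling diameter quadruples the area, so R drops to a quarter
    area = math.pi * (diameter_m / 2) ** 2
    return resistivity * length_m / area

base = wire_resistance(1.0, 0.001)
print(wire_resistance(2.0, 0.001) / base)  # 2.0: longer wire, more resistance
print(wire_resistance(1.0, 0.002) / base)  # 0.25: thicker wire, less resistance
```

The intuition: a longer wire means electrons fight through more material in series, while a thicker wire offers more parallel paths.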

One of my biggest recent realizations was that nearly all of software development is basically about turning a slow, manual process into a faster, automated one. Modern CI/CD stems from a bunch of shell commands that somebody wrote and manually executed to test an app and upload it to a server. Modern automated software testing stems from humans writing small test apps and running them to confirm correct behavior. Many modern development practices stem from making those small test apps easier and faster to write. It's all just a giant manual-process-to-automated-process time-saving machine.
The power of an outline when writing.

Over the past few years, I've been teaching myself how to write better. I'm not talking about elementary syntax or grammar. I'm not talking about writing the traditional, American English five paragraph essay. I'm talking about writing longer pieces of prose, articles or blog posts or short chapters with word counts ranging anywhere between 1500-3000 words. On this journey of improving the craft, I realized that one of my biggest struggles was writing cohesively. Although I've been able to get lots of words on (digital) paper, eventually I'd get lost in my own web of thoughts, the article itself totally incoherent, no structure, no organization.

Constructing outlines and reverse outlines[0] has helped me tremendously. It's not easy ... but the concept itself is finally — years later — starting to click.

[0] - https://explorationsofstyle.com/2011/02/09/reverse-outlines/

Unit testing and using dependency injection to write test-able code.

I'm not sure if it was years, but it wasn't immediate. I just didn't understand why dependency injection was good at first, and not just someone's weird personal code style choice.

I thought it was just people being "Enterprisey" which I'd encountered many times over the years.

Once I committed to unit testing, I realized how necessary it is.

Unfortunately I still encounter customers who haven't bought into it and as a result have untestable code. It's so hard to go back and retrofit testing.

Another solution is to eliminate classes and only use structs or similar plain objects. Makes mocking and testing functions much easier. At this point I see no reason for OOP whatsoever and consider it a big mistake.
Seven years into my career I'm increasingly convinced that the emperor has no clothes with respect to unit tests that are just transcripts of the code under test with "mock.EXPECT()" prepended to everything - 95% by volume of the unit test code I've ever read, written, or maintained.
Yeah, the mere presence of unit tests is not enough. They have to actually assert something useful.

When I code review, I try to make sure I call out "fake tests".

Can you explain the concept? I feel I didn't grok it fully
Not sure about poster above, but I found a large amount of value in writing tests when developing API backends. I knew the shapes and potential data. Was easier to write tests to confirm endpoints looked right than manually hit the api endpoints.
> dependency injection

The term is unfamiliar to me -- is it related to "fault injection"?

Let's say Class A depends on Class B. A lot of people have the instincts to have A construct B so that the callers of A don't have to worry about it.

But this makes testing A in isolation difficult. When testing A, you want to mock out B with an instance the test can manipulate.

So we want A not to create B; instead, we want B to be "injected" into A. The general strategy of having B passed into A is called dependency injection.
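A minimal sketch of that idea (the `Database`/`UserService` names are hypothetical, not from any specific framework):

```python
class Database:
    def fetch_user(self, user_id):
        # imagine a real network call here
        raise NotImplementedError("talks to a real database")

class UserService:
    def __init__(self, db):
        # the dependency is passed in ("injected"), not constructed here
        self.db = db

    def greeting(self, user_id):
        return f"Hello, {self.db.fetch_user(user_id)}!"

# In a test, substitute a fake with the same interface:
class FakeDatabase:
    def fetch_user(self, user_id):
        return "Alice"

service = UserService(FakeDatabase())
assert service.greeting(42) == "Hello, Alice!"
```

If `UserService.__init__` called `Database()` itself, the test above would have no clean way to avoid the real network call.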

The idea of DI is that you create resources indirectly and don't call the concrete constructor in your code. You then use interface types to interact with the resource.

This way, your code isn't bound to a specific implementation.

Some DI frameworks even go so far as to define all resources in a config file. That way you can switch out the implementation without recompiling.

no, it's essentially passing the things that a piece of code depends on - for example, client objects - instead of instantiating them inside of the code itself. If you instantiate the client objects inside of the code, in order to write unit tests, you would have to intercept the client library load in the code under test in order to inject a mock. With dependency injection, you can pass a dummy object with the same interface as the client, it doesn't require any mucking around with library loading.
No, it's a design pattern for structuring code. It's also known as the Hollywood principle ("don't call us, we'll call you"): the dependencies of a class are provided from the outside, so the class itself doesn't know how to create instances of its dependencies and just relies on the dependency container (this arrangement is also called inversion of control).
It is a pattern where you inject the dependencies of a class into it rather than create the instances of the dependencies within the class. for more info --> https://en.wikipedia.org/wiki/Dependency_injection

it makes the code cleaner and testable.

Forward-Backward algorithm before there were all sorts of resources and explanations on how it works online.
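For anyone in the same spot today: the forward pass (half of the algorithm) is short once you see it as accumulating path probabilities. A hedged sketch for a toy two-state HMM; all the numbers below are made up:

```python
def forward(obs, states, start_p, trans_p, emit_p):
    # alpha[t][s] = P(obs[0..t], state at time t == s)
    alpha = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    for o in obs[1:]:
        prev = alpha[-1]
        alpha.append({
            s: emit_p[s][o] * sum(prev[r] * trans_p[r][s] for r in states)
            for s in states
        })
    # total likelihood of the whole observation sequence
    return sum(alpha[-1].values())

states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
print(forward(("walk", "shop"), states, start_p, trans_p, emit_p))
```

The backward pass is the mirror image (probabilities of the remaining observations given each state), and combining the two gives the per-timestep state posteriors.
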
The autonomic nervous system and the adrenal cortex. Homeostasis is taught as a textbook fact: the body reverting to a baseline over time. What's not taught is how much the impacts of daily life events drive a continuous stress response. Fight or flight is not just a reaction to deadly threats; it's active every moment of every day to ensure survival. The adrenal cortex is always responding: to traffic, to your boss and colleagues, to relationship struggles, and to overall health and wellness factors like sleep and nutrition. Yes, the system reverts to a baseline over time, but how much that baseline varies is obvious when you track resting heart rate.
"Understanding Stresses and Strains" from 1968 presents a good depiction of the mechanism with familiar-looking cartoon characters: https://archive.org/details/understandingstressesandstrains

The caveman pressing the alarm button is something I think of a lot.

The use of Predicate Calculus in coming up with the Proof along with the Program.

Predicate Calculus is used to show that the path followed by a Process through a Cartesian Product space (created from all the memory variables in a Program) is the one you had in mind w.r.t. its Specifications. Suddenly you start to understand basic Set Theory, Types, Relations (Functions) and Logic.
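A tiny illustration in that spirit: a loop annotated with its precondition, invariant, and postcondition, which a predicate-calculus proof would establish formally. The invariant comments below are my own sketch, not from any particular textbook:

```python
def factorial(n):
    # Precondition: n >= 0
    assert n >= 0
    result, i = 1, 0
    # Loop invariant: result == i! and 0 <= i <= n
    while i < n:
        i += 1
        result *= i
        # The state stays on the intended path through the product space:
        assert 0 <= i <= n
    # Postcondition: i == n, so result == n!
    return result

assert factorial(5) == 120
```

The proof obligation is exactly the invariant: show it holds before the loop, is preserved by each iteration, and together with the exit condition implies the postcondition.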

Entropy. Both in information and in thermodynamics, and how brilliantly they are connected. The audiobook "The Big Picture" by Sean Carroll has helped a lot.
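The information-theory side fits in a few lines. A quick sketch of Shannon entropy (my own example, not from the book):

```python
import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)), measured in bits;
    # terms with p == 0 contribute nothing by convention
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# Four equally likely outcomes carry two bits:
print(shannon_entropy([0.25] * 4))   # 2.0
```

The thermodynamic connection is that Boltzmann's S = k ln Ω has the same shape: both count how many microstates are consistent with what you know.
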
That professors were thinking real hard about which problem sets to give us in hopes that we would actually learn something.

I don't think I really understood that completely until I started TA-ing.

Lie Groups... I signed up for a grad level course in my second year, and had no idea what was going on. Eventually I did my PhD on algebraic combinatorics, which works with Lie Groups quite a lot, but it took years to internalize all the ideas needed to have any intuition at all.

https://en.m.wikipedia.org/wiki/Lie_group

i didn't really understand hypermedia, and, in particular, the uniform interface/HATEOAS until a few years after i started building intercooler.js (htmx predecessor)

https://intercoolerjs.org/2016/01/18/rescuing-rest.html

much later:

https://htmx.org/essays/hateoas/
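The core idea is that responses carry the actions you may take next, so clients discover them at runtime instead of hard-coding them. A small sketch with a hypothetical bank-account resource (my own example, not from those essays):

```python
# A hypermedia response doesn't just return state; it embeds the
# legal next actions as links.
def account_response(balance):
    links = {"self": "/accounts/123", "deposit": "/accounts/123/deposits"}
    if balance > 0:
        links["withdraw"] = "/accounts/123/withdrawals"
    return {"balance": balance, "_links": links}

# An overdrawn account simply stops advertising "withdraw";
# the client needs no hard-coded business rule for that.
print(sorted(account_response(100)["_links"]))
print(sorted(account_response(-25)["_links"]))
```

That is the uniform interface in miniature: the server, not the client, owns the application's state machine.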

Basic music theory.

Almost no one I encountered bothered to actually explain anything. They simply regurgitated things and I guess expected me to somehow intuitively understand something or other.

Sadly, mathematics classes are like that as well. Instructors start throwing equations on the board, expecting us to somehow connect it all together. The best math textbook (Theory of Algebra) I ever read had little sections about the person who developed a particular subject, why they were studying it, and how the subject is used.
Do you remember what the book was called, or the writers?
I recently learned what solfege actually is from a simple ChatGPT conversation after hearing about it for many years.
Discrete mathematics and all sorts of its application in real-world (software development) related problems. Also how any given solution to a problem in one problem domain can be transferred to a problem in another unrelated domain. Think Galois theory but waaaay less fancy :-)
Blue noise dithering, based on a HN post from @todsacerdoti: https://news.ycombinator.com/item?id=25633483 At my previous job we had an ASIC hardware block to implement blue noise dithering. No one, not even the people who created it, could explain to me how I needed to use it. Years later, I read their blog post and a light bulb went on.
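The mechanism itself is simple once it clicks: compare each pixel against a per-pixel threshold from a tiled noise texture (ordered dithering). A sketch; a real blue-noise mask would replace the tiny hand-made matrix below:

```python
def dither(pixels, mask):
    # pixels and mask are 2D lists of values in [0, 1); output is 0 or 1.
    # The mask tiles across the image, supplying a different
    # threshold at each position.
    h, w = len(pixels), len(pixels[0])
    mh, mw = len(mask), len(mask[0])
    return [[1 if pixels[y][x] > mask[y % mh][x % mw] else 0
             for x in range(w)] for y in range(h)]

# A 2x2 Bayer matrix stands in for a blue-noise texture here:
mask = [[0.00, 0.50],
        [0.75, 0.25]]
flat_gray = [[0.4] * 4 for _ in range(4)]
for row in dither(flat_gray, mask):
    print(row)
```

A flat 40% gray comes out as a checkerboard-like pattern of on and off pixels; blue noise just chooses the thresholds so the resulting pattern has no low-frequency clumps.
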
It took a couple years in college for me to understand entropy.

Entropy in classical thermodynamics is presented in a mysterious way that leads to confusion.

Entropy in statistical thermodynamics, however, is logical. Once one understands basic statistical thermodynamics, entropy isn't mysterious.

The book in my statistical thermodynamics class was An Introduction to Thermal Physics by Daniel Schroeder, which is an excellent book that I've referred to many times since.
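The statistical view is concrete enough to compute directly. A sketch in the spirit of the Einstein-solid examples in books like Schroeder's, with Boltzmann's constant set to 1 (my own toy numbers):

```python
import math

def multiplicity(oscillators, quanta):
    # Omega(N, q) = C(q + N - 1, q): the number of ways to
    # distribute q energy quanta among N oscillators
    return math.comb(quanta + oscillators - 1, quanta)

def entropy(oscillators, quanta):
    # S = ln(Omega), in units where Boltzmann's constant k = 1
    return math.log(multiplicity(oscillators, quanta))

# Entropy grows as the energy can spread over more microstates:
for q in (1, 10, 100):
    print(q, entropy(50, q))
```

Nothing mysterious: entropy is just a logarithm of a count, and the second law is the statement that overwhelmingly probable counts win.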

I had to study entropy twice in college for different courses that were 2 years apart from each other, and I still remember this one quote I read somewhere:
  The first time you study entropy, you won't understand it. The second time you study it, you'll think you understood it until you realize you didn't. By the third time you study it, you just don't care anymore and just use it.
Ten years after graduating, I haven't encountered entropy again since that second time, so you can guess where I'm at in this quote. But thank you; for now I know how to attack it if I ever need it again.
The Fourier transform. I encountered it first in my undergrad engineering degree, where it was presented as dry mathematics with no real explanation: they just threw complex exponentials at us, and pages of derivations. Years later I actually use it in my job, and through that and other material I can see its beauty, and how it's actually not that complicated. Some great resources like this helped a lot:

[0] https://betterexplained.com/articles/colorized-math-equation...
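The whole transform is a short loop once the complex exponentials stop being scary. A naive DFT sketch (illustrative only; real code would use an FFT library):

```python
import cmath
import math

def dft(signal):
    # X[k] = sum_n x[n] * exp(-2*pi*i*k*n/N): correlate the signal
    # with a spinning complex exponential at each frequency k
    n_samples = len(signal)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * n / n_samples)
                for n, x in enumerate(signal))
            for k in range(n_samples)]

# A pure cosine at frequency 1 shows up in bins 1 and N-1:
signal = [math.cos(2 * math.pi * n / 8) for n in range(8)]
print([round(abs(X), 6) for X in dft(signal)])
```

Each output bin is just "how much does the signal look like this frequency?", which is the intuition the derivations tend to bury.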

This is what made it click for me, can recommend it:

3Blue1Brown -- But what is the Fourier Transform? A visual introduction.

https://www.youtube.com/watch?v=spUNpyF58BY

The monitor model in second language acquisition. Or, more accurately, the more contemporary synthesis that it's developed into in the decades since it was first proposed.

The model itself is easy enough to grasp. But concretely understanding what it implies about how I should be studying took much, much longer.

Very intriguing! Any links you can share on this theory? A quick Google search gave me an overview, but I don’t see how this is particularly useful for second language acquisition.
Systems in equilibrium. A lot of my college engineering courses had these (what seemed to me to be) hand-wavy assertions of equality and what seemed like just an assumption that the system would converge to that point.

I was probably in my late 30s or early 40s before I really grokked why that tended to be true. (I could blindly accept and grind through the equations to get the answers in college, but it was decades later that I developed a feel for why.)
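One way to build that feel is to watch a system relax numerically. A toy sketch of Newton's law of cooling converging to its equilibrium (my own illustration, arbitrary constants):

```python
def cool(temp, ambient=20.0, k=0.3, steps=40):
    # dT/dt = -k * (T - ambient): the further the system is from
    # equilibrium, the harder it is pulled back toward it, so the
    # gap shrinks geometrically and the system settles at ambient.
    for _ in range(steps):
        temp += -k * (temp - ambient)
    return temp

print(cool(90.0))   # settles near the 20.0 ambient temperature
print(cool(-40.0))  # converges to the same point from below
```

The equality assertions in those courses are usually just this: the restoring term vanishes exactly at the point the equations assert, and is self-correcting everywhere else.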

Staying out of debt. A simple concept that took me until I was 37 to appreciate and understand.
blockchains as a single source of truth

always seemed like a shitty expensive database for 7+ years
