add reading reflections

2025-11-21 10:30:19 -05:00
parent 6140120443
commit 471221602c
11 changed files with 188 additions and 0 deletions

View File

@@ -0,0 +1,18 @@
What points were the most clear to you? (List up to 3)
- Fundamentals of logic gates (Or, And, Not) made sense.
- The explanation of how a NAND gate can build the three fundamental gates made sense.
Grading comment:
What points were the muddiest and you'd like to talk more about? (List up to 3)
- The idea of how a multiplexer functions makes sense, but its building blocks were kinda fuzzy (required external research).
- Same with a demultiplexer, but I have a better understanding now.
Grading comment:
Reflect on what you read.
Give me a sense about what is connecting to existing knowledge
-OR-
Your "ah ha!" moments
-OR-
What is hanging off by itself, not connecting.
The concepts of chapter 1 felt relatively simple. I understood logic gates and truth tables because of my prior courses, namely discrete math. The snippets of HDL and API descriptions also made a lot of sense to me, since I work with more complex API structures and took CSCI 306, which covered assembly. The idea of this course is exciting: I've worked on every individual segment before, but I've never gone from the fundamentals to a complete product. There are usually layers of abstraction in my projects, whether working with existing frameworks or stopping at a level where I show proficiency for a class assignment.
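To convince myself that NAND really can build everything (and to un-fuzz the multiplexer a bit), I wrote a quick Python check of my own; this is not course HDL, just a toy model:

# Building the basic gates, and a mux, out of a single NAND function.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    # De Morgan: a OR b == NAND(NOT a, NOT b)
    return nand(not_(a), not_(b))

def mux(a, b, sel):
    # sel == 0 selects a, sel == 1 selects b
    return or_(and_(a, not_(sel)), and_(b, sel))

# Print the truth tables to verify.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "| and:", and_(a, b), "or:", or_(a, b), "mux(sel=1):", mux(a, b, 1))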

View File

@@ -0,0 +1,20 @@
What points were the most clear to you? (List up to 3)
- Tokenization makes sense- breaking code into symbols, keywords, identifiers
- Okay, parsing follows the grammar structure naturally
- The XML output is just a way to visualize the parse tree structure
Grading comment:
What points were the muddiest and you'd like to talk more about? (List up to 3)
- Why XML output instead of just building the parse tree directly in some sort of object/other format
- The LL(1) grammar thing: it seems like Jack is deliberately kept simple to minimize lookahead (tokenization again! like AI!!!)
- How do we handle operator precedence if Jack doesn't enforce it? Lots of parentheses?
Grading comment:
Reflect on what you read.
Give me a sense about what is connecting to existing knowledge
-OR-
Your "ah ha!" moments
-OR-
What is hanging off by itself, not connecting.
This chapter shows how compilers work under the hood: tokenization breaks source code into basic elements, then recursive descent parsing follows the grammar rules to build a parse tree. The XML output is just a demonstration tool to show that the parser understands the program structure. This connects to my discrete math course; the grammars finally have a practical application beyond theory (finally! theory into practice! I thought this day would never come). The (nearly) LL(1) grammar design makes sense from an implementation perspective, though it feels wrong compared to real languages. Jack's deliberate simplifications (mandatory keywords, no operator precedence, forced curly braces) are clearly intended to make compiler construction easier to learn, but they result in a clunky syntax that I definitely wouldn't want to use in practice. Still, breaking this complex topic into manageable pieces (tokenizer + parser) makes a ton of sense.
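To make the tokenizer idea concrete for myself, here's a toy Python version I sketched; it has nothing to do with the actual Jack analyzer, and the keyword list and token names are just placeholders:

import re

# Toy tokenizer in the spirit of the chapter: split source into keywords,
# symbols, integers, and identifiers.
KEYWORDS = {"let", "if", "else", "while", "return"}
TOKEN_RE = re.compile(r"\s*(?:(\d+)|([A-Za-z_]\w*)|(.))")

def tokenize(source):
    tokens = []
    for number, word, symbol in TOKEN_RE.findall(source):
        if number:
            tokens.append(("intConst", number))
        elif word:
            tokens.append(("keyword" if word in KEYWORDS else "identifier", word))
        elif symbol.strip():
            tokens.append(("symbol", symbol))
    return tokens

print(tokenize("let x = x + 1;"))
# -> [('keyword', 'let'), ('identifier', 'x'), ('symbol', '='), ('identifier', 'x'),
#     ('symbol', '+'), ('intConst', '1'), ('symbol', ';')]

A parser would then walk this token list according to the grammar rules, which is the recursive descent part.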

View File

View File

@@ -0,0 +1,17 @@
What points were the most clear to you? (List up to 3)
- The concept of binary addition and the operations built on it
- The differences between a half adder and a full adder, and how to connect them
Grading comment:
What points were the muddiest and you'd like to talk more about? (List up to 3)
- The idea of an ALU made sense, but the implementation details were a little dense. I understand it now after working on the project for a bit.
Grading comment:
Reflect on what you read.
Give me a sense about what is connecting to existing knowledge
-OR-
Your "ah ha!" moments
-OR-
What is hanging off by itself, not connecting.
As with the prior chapter, this felt like a review of CSCI 306, where we went over integer/binary addition, subtraction, etc. The idea of an ALU was touched on there, but I feel like this course is going to diverge a bit: less theory, more implementation. Again, I'm excited to see where this goes.
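Quick sanity check I did on the half adder vs. full adder distinction, in plain Python rather than HDL (my own sketch, not the book's chip logic):

def half_adder(a, b):
    # sum bit is XOR, carry bit is AND
    return a ^ b, a & b

def full_adder(a, b, c):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, c)
    return s2, c1 | c2   # carry out if either stage carried

def add16(x, y, bits=16):
    # chain full adders LSB-first, exactly like wiring them up in a chip
    result, carry = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # the final carry (overflow) is simply dropped

print(add16(0b0101, 0b0011))  # 8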

View File

@@ -0,0 +1,17 @@
What points were the most clear to you? (List up to 3)
- Concept of clock cycles, separating out calculations into time units
- Flip-flops storing data
Grading comment:
What points were the muddiest and you'd like to talk more about? (List up to 3)
- How do we maintain a clock with our current system? Everything computes instantly so far, so how do we modify what we have?
Grading comment:
Reflect on what you read.
Give me a sense about what is connecting to existing knowledge
-OR-
Your "ah ha!" moments
-OR-
What is hanging off by itself, not connecting.
So my first thought: concurrency?? Already? That doesn't sound good. But then I read on, and it started to make sense. Clock cycles were expected, though I thought they'd have come up with the ALU (but the ALU is simple enough on its own). I remember building flip-flops in Minecraft (not surprisingly), so that concept also makes sense. Other than that, the RAM seems to scale up quickly, but from an implementation perspective it's mostly repetition, so it should be simple enough once the basic unit is implemented. We're scaling up rapidly here, yet the concepts still remain within grasp.
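For my own muddy point about the clock, here's how I'm picturing it in Python: combinational logic "computes instantly," but a flip-flop only passes its input to its output when the clock ticks. This is my mental model, not the course's actual DFF primitive:

class DFF:
    def __init__(self):
        self.out = 0        # value visible during the current cycle
        self._next = 0      # value latched for the next cycle

    def set_input(self, value):
        self._next = value & 1

    def tick(self):         # one clock edge
        self.out = self._next

class BitRegister:
    """1-bit register: keeps its value unless load is asserted."""
    def __init__(self):
        self.dff = DFF()

    def cycle(self, value, load):
        self.dff.set_input(value if load else self.dff.out)
        self.dff.tick()
        return self.dff.out

reg = BitRegister()
print(reg.cycle(1, load=1))  # 1 (stored)
print(reg.cycle(0, load=0))  # 1 (held, because load was not asserted)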

View File

@@ -0,0 +1,19 @@
What points were the most clear to you? (List up to 3)
- Machine language is familiar; this particular flavor is not.
- Working with memory, the M (memory) register vs. the D (data) register, and loops/jumps
Grading comment:
What points were the muddiest and you'd like to talk more about? (List up to 3)
- Back to assembly… can we not? (kidding, but also not really.)
- Input/output: do we have predefined registers outside the RAM range?
- Working with the screen seems intimidating; we'll see how that goes. Any tips would be appreciated.
Grading comment:
Reflect on what you read.
Give me a sense about what is connecting to existing knowledge
-OR-
Your "ah ha!" moments
-OR-
What is hanging off by itself, not connecting.
As with project03, this assembly language project overlaps with CSCI 306, where we worked with assembly. Although I didn't necessarily like that part of the course, I understood it, and I'm hoping that carries over now. The only problem I see myself facing in the near future is shedding my RISC-V assembly tendencies and learning Hack, which seems to have odd syntax by comparison. Otherwise, the instructions make sense from a high level and appear to be straightforward.
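On my I/O question, my current understanding (which could be off) is that the screen and keyboard aren't outside RAM; they're fixed addresses inside the same address space. A little Python sketch of how I'm picturing the memory map, using the addresses I pulled from the chapter:

RAM = [0] * (24576 + 1)   # 16K data + 8K screen map + 1 keyboard register
SCREEN, KBD = 16384, 24576

def set_pixel(row, col):
    # each 16-bit word holds 16 pixels; 32 words per row of the 512x256 screen
    addr = SCREEN + row * 32 + col // 16
    RAM[addr] |= 1 << (col % 16)

set_pixel(0, 0)
print(RAM[SCREEN])   # 1 -> top-left pixel is on
print(RAM[KBD])      # 0 -> no key currently pressed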

View File

@@ -0,0 +1,18 @@
What points were the most clear to you? (List up to 3)
- The four main sections of a CPU: ALU, registers, control, and I/O.
- The screen made sense, as I had looked this up before.
Grading comment:
What points were the muddiest and you'd like to talk more about? (List up to 3)
- Where are we storing the ROM?
- Is this OS going to have a scheduler?
Grading comment:
Reflect on what you read.
Give me a sense about what is connecting to existing knowledge
-OR-
Your "ah ha!" moments
-OR-
What is hanging off by itself, not connecting.
Starting to get a little muddy, but I'm still holding on. I knew the parts of the CPU, and connecting them all makes sense, but I'm too used to Linux and I keep confusing layers of complexity. Abstracting all of the functions into sections of the CPU is what's going to hold it together, I think. Now it's just taking the API spec and turning it into something functional. We'll see how this goes.
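To untangle the layers I keep confusing, I sketched my mental model of the fetch-decode-execute loop in Python. The instructions here are made up (not Hack), and the real CPU is wired very differently; the point is just that ROM, RAM, the register, and control are separate pieces:

ROM = ["load 7", "add 3", "store 0"]   # made-up instructions, not Hack syntax
RAM = [0] * 16
D = 0                                   # one data register, way simplified
pc = 0                                  # control: which instruction runs next

while pc < len(ROM):
    op, arg = ROM[pc].split()           # fetch + decode
    arg = int(arg)
    if op == "load":                    # the "ALU"/data path does the work
        D = arg
    elif op == "add":
        D = D + arg
    elif op == "store":
        RAM[arg] = D
    pc += 1                             # control advances to the next instruction

print(RAM[0])   # 10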

View File

@@ -0,0 +1,19 @@
What points were the most clear to you? (List up to 3)
- Lots of hardcoding, but it's okay here (software brain scared)
- Symbolic references seem like a fun challenge. I'm sure I'm the only person to say this.
- This is just translation: instruction X equals value Y, just a bit more complicated. The spec will help.
Grading comment:
What points were the muddiest and you'd like to talk more about? (List up to 3)
- I don't like hardcoding. Magic numbers are bad.
- Are we going to end up optimizing later? Especially since this is a low-power CPU.
Grading comment:
Reflect on what you read.
Give me a sense about what is connecting to existing knowledge
-OR-
Your "ah ha!" moments
-OR-
What is hanging off by itself, not connecting.
I feel like I say this every time, but this is where it's getting good. We have the system, we have the language; now let's make the system recognize the language (kinda but not really, you get what I mean). This again calls back to CSCI 306, but there we didn't write our own assembler, we just explored how one worked. RISC-V, although simple, isn't as simple as this machine. I'm worried about my resistance to magic numbers and hardcoding, although that'll be necessary here. I'm also not sure where to start, but that's a later problem.
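Here's my guess at the shape of the symbolic-reference part, as a two-pass toy in Python. It only handles labels (no variables, no C-instruction encoding), so it's definitely not the real assembler:

def assemble(lines):
    symbols, instructions = {}, []
    # pass 1: record each label's address (= index of the instruction that follows it)
    for line in lines:
        if line.startswith("(") and line.endswith(")"):
            symbols[line[1:-1]] = len(instructions)
        else:
            instructions.append(line)
    # pass 2: swap symbolic @references for numeric addresses
    out = []
    for line in instructions:
        if line.startswith("@") and line[1:] in symbols:
            out.append("@" + str(symbols[line[1:]]))
        else:
            out.append(line)
    return out

print(assemble(["@2", "D=A", "(LOOP)", "@LOOP", "0;JMP"]))
# -> ['@2', 'D=A', '@2', '0;JMP']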

View File

@@ -0,0 +1,21 @@
What points were the most clear to you? (List up to 3)
- The concept of a VM (I work with Java a ton, this made sense)
- Two-tiered architecture (translation between code and byte code, and another between byte code to machine language)
- Push and pop, the stack, eventually a heap (I guess not?)
Grading comment:
What points were the muddiest and you'd like to talk more about? (List up to 3)
- Why are we using a VM here? Isn't that inefficient? Shouldn't we be emulating something C-like instead of Java-like? Java is sloooow.
- Again, as I read more: why a VM? This is just an extra layer! We're not going for cross-compatibility here; this is a weak machine!
Grading comment:
Reflect on what you read.
Give me a sense about what is connecting to existing knowledge
-OR-
Your "ah ha!" moments
-OR-
What is hanging off by itself, not connecting.
I'm definitely (not) biased here. My software engineering brain has one of those rotating police sirens going off. Why a VM? That's super inefficient here. We have a low-power computer, we're not optimizing for cross-platform code compatibility, and we're not even using a standard architecture! So why are we using this? To me, it looks like a useless intermediary step that exists just to complicate things.
That being said, learning about a VM language is a different story, and that's how I'm rationalizing this seemingly bad decision. If this VM exists to teach us about programming language design, fine, but that's better left for CSCI 308.
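Rant aside, to get push/pop into my head I wrote a toy stack machine in Python. It's not the VM spec, just the core idea that everything is an operation on one stack:

stack = []

def run(commands):
    for cmd in commands:
        parts = cmd.split()
        if parts[0] == "push":
            stack.append(int(parts[2]))          # e.g. "push constant 7"
        elif parts[0] == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif parts[0] == "eq":
            b, a = stack.pop(), stack.pop()
            stack.append(-1 if a == b else 0)    # VM-style true (-1) / false (0)
    return stack

print(run(["push constant 7", "push constant 8", "add"]))   # [15]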

View File

@@ -0,0 +1,19 @@
What points were the most clear to you? (List up to 3)
- Okay, so we're going up a step. We made the VM code turn into Hack ASM, but we need something to produce the VM language files in the first place; we're not writing VM code directly.
- So we're translating Jack (Java-ish) to VM! New syntax to learn, but it's only 8 library files. That sounds doable.
- The “calling protocol” makes sense; it's similar to the call stack I'm used to.
Grading comment:
What points were the muddiest and you'd like to talk more about? (List up to 3)
- Again: VM bad, extra obscurity! Good for education, though.
- How do we really define functions? That's the library stuff I don't quite get yet.
Grading comment:
Reflect on what you read.
Give me a sense about what is connecting to existing knowledge
-OR-
Your "ah ha!" moments
-OR-
What is hanging off by itself, not connecting.
We're going up in the CS course stack! This time, it's a mix of 306 and 308: computer systems and programming language design. I understand how we are using functions now: converting a user-friendly programming language to a VM intermediary, then compiling to assembly. This makes sense and is exactly what I wanted to see in the course! I wanted this higher-level connection of language to compiler to compiled code; the VM step, no matter how much I disagree with it, is nice too.
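And here's how I'm picturing the calling protocol, as a toy Python model. The real VM saves segment pointers and a return address; I'm just waving at all of that with a "frames" list:

stack, frames = [], []

def call(func, n_args):
    args = [stack.pop() for _ in range(n_args)][::-1]   # caller already pushed these
    frames.append("saved caller state")                 # stands in for return addr + segments
    stack.append(func(*args))                           # one return value replaces the args
    frames.pop()                                         # restore the caller's frame

def multiply(a, b):                                      # stands in for a compiled VM function
    return a * b

stack.extend([6, 7])                                     # caller pushes arguments
call(multiply, 2)
print(stack)                                             # [42] -- exactly one value left behind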

View File

@@ -0,0 +1,20 @@
What points were the most clear to you? (List up to 3)
- Jack syntax is pretty straightforward- Java-ish but simplified, which makes sense
- OOP concepts translate well, even on this basic platform
- The standard library provides the building blocks we need for larger programs
Grading comment:
What points were the muddiest and you'd like to talk more about? (List up to 3)
- Memory management seems manual- are we doing our own malloc/free equivalent?
- Graphics programming at this level feels primitive…
- How do we debug Jack programs effectively without proper debugging tools?
Grading comment:
Reflect on what you read.
Give me a sense about what is connecting to existing knowledge
-OR-
Your "ah ha!" moments
-OR-
What is hanging off by itself, not connecting.
Finally! We're at the application layer; this is where it all comes together. After building everything from NAND gates to a VM, we can actually write real programs that do useful things. The Jack language feels like a stripped-down Java, which makes sense given my background. It's cool to see how OOP concepts work even on this simple platform. I'm now realizing that I've built a complete computing stack. We went from basic logic gates to writing applications with graphics and user interaction, which is insane when you think about it. Every layer we built is now being used: the ALU for calculations, memory for storage, the screen for output. What still bugs me is the realization that this is still pretty primitive compared to modern development. No garbage collection, limited graphics, basic I/O. But I know that's the point; I understand what's happening under the hood now. When I write programs normally, there are so many layers of abstraction I never think about.
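On the malloc/free question: from what I can tell, Jack's OS library does expose Memory.alloc and Memory.deAlloc and you call them yourself. Here's a toy free-list sketch in Python of what I imagine could sit underneath (my guess, not the book's actual algorithm, which I haven't read yet):

heap = [0] * 64              # fake heap words
free_list = [(0, 64)]        # (start, length) blocks that are currently free

def alloc(size):
    for i, (start, length) in enumerate(free_list):
        if length >= size:
            free_list[i] = (start + size, length - size)   # carve the block off the front
            return start                                    # the "pointer" is just a base index
    raise MemoryError("no free block big enough")

def de_alloc(ptr, size):
    free_list.append((ptr, size))   # a real version would merge adjacent free blocks

p = alloc(8)
heap[p] = 123
de_alloc(p, 8)
print(p, free_list)              # 0 [(8, 56), (0, 8)]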