

You make good points, but I doubt you’d continue to feel that way if you were a shareholder
It’s always funny to see periodicals talk about Valve like they’re a normal publicly traded tech company.
Valve is private. That fact alone is neither inherently good nor bad. What it does mean, though, is that Valve will very likely behave very differently from other companies in the same market. Heck, I very much doubt Half-Life: Alyx would exist if they were public. If we get HL3, it will likely be a similar case.
…but some nerds are more equal than others.
As an incurable optimist, I look forward to the day digitally licensed media goes under, and analog media makes its grand return
import numpy as np

# ddof=1 gives the sample standard deviation (n-1 denominator),
# which is what R's sd() computes by default
temp = np.array([22, 21, 25, 23])
sd_temp = np.std(temp, ddof=1)
print(sd_temp)
Vs
temp <- c(22, 21, 25, 23)
sd(temp)
How in the world is R more clunky than python?
Edit: and I didn’t even mention how python likes to break unrelated software packages whenever I’m forced to use it.
I’ll never understand why my classmates prefer python to R.
Not to mention python has a tendency to influence things outside of its domain. I’ve configured my software repos to never update any packages containing python scripts or dependencies, because every time python updates, there’s a chance all those packages will stop working.
Safe code is a skill, not a feature.
The secret to success in software engineering:
I read a blog post by an ex-Microsoft employee who claimed the Windows team has no seniors. Anyone who has worked there for a year or two leaves for a better employer. Nobody knows how to refactor or maintain the old codebases, so instead they just write new things on top of the old things. The Windows kernel has hardly changed since XP.
If choice is our metric here, why not C? That way, you have the choice to use your own implementation of OOP
Nah, coding is one of the few things I don’t find annoying, so long as the language or toolsets I’m using allow for freedom. What I find annoying is when some talking head says all code should be a certain way, and everybody believes them for some reason.
It would be extremely annoying to be forced to write all my code functionally.
But I find it even more annoying to be forced to write all my code object oriented. Looking at you, python and java.
I’ve wondered why programming languages don’t include exact fractions as part of their standard utils. I don’t mind calling dc, but I wish I didn’t need to write a bash script to pipe the output of dc into my program.
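For what it’s worth, at least one standard library already covers this: Python ships a fractions module whose Fraction type does exact rational arithmetic, no dc pipeline required. A minimal sketch:

```python
from fractions import Fraction

# Exact rational arithmetic: no floating-point rounding anywhere
a = Fraction(1, 3) + Fraction(1, 6)
print(a)         # → 1/2
print(float(a))  # → 0.5 (convert only when you actually need a float)
```

Fraction values mix freely with ints in arithmetic and comparisons, and stay exact until you explicitly convert them.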
This is an affront to nature. Comments shouldn’t even make it past the scanner.
Electrical Engineering really is a no-frills field; you either can do it, or you can’t. Our only testing methodology is this: if they know what they’re doing, they’ll pass and do well in the major. If they don’t know what they’re doing, they’ll fail and rethink their major.
Knowing what they’re doing is the important part. If genAI chatbots helped in that regard, we’d allow them, but none of us have observed any improvement. Rather, students waste time they could be using to progress on the assignment struggling instead to incorporate poorly optimized nonsense code they don’t understand. I can’t tell you how many times I’ve had conversations like:
“Why doesn’t this work?”
“Well I see you’re trying to do X, but as you know, you can’t do X so long as Y is true, and it is.”
“Oh, I didn’t know that. I’ll rewrite my prompt.”
“Actually, there’s a neat little trick you can do in situations like these. I strongly suggest you look up the documentation for function Z. It’s an example of a useful approach you can take for problems like these in the future.”
But then instead of looking it up, they just open their chatgpt tab and type “How to use function Z to do X when Y is true.”
I suppose after enough trial and error, they might get the program to work. But what then? Nothing is learned. The computer is as much a mystery to them after as it was before. They don’t know how to recognize when Y is true. They don’t know why Y prevents X. They don’t understand why function Z is the best approach to solving the problem, nor could they implement it again in a different situation. Those are the things they need to know in order to be engineers. Those are the things we test for. The why. The why is what matters. Without the why, there can be no engineering. From all that we’ve seen thus far, genAI chatbots take that why away from them.
If they manage to pass the class without learning those things, they will have a much, much harder time with the more advanced classes, and all the more so when they get to the classes where chatgpt is just plain incapable of helping them. And if even then, by some astronomical miracle, they manage to graduate, what then? What will they have learned? What good is an engineer who can only follow pre-digested instructions instead of making something nobody else has?
I mean, they don’t generally keep their use of chatgpt a secret. Not for now, anyway. Meanwhile, the people who do well in the class write their code in a way that clearly shows they read the documentation, and have made use of the headers we’ve written for them to use.
In the end, does it matter? This isn’t a CS major, where you can just BS your way through all your classes and get a well-paying career doing nothing but writing endpoints for some js framework. We’re trying to prepare them for when they’re writing their own architectures, their own compilers, their own OSes; things that have zero docs for chatgpt to chew up and spit out, because they literally don’t exist yet.
Managers hoping genAI will cause the skill requirements (and paycheck demand) of developers to plummet:
Also managers when their workforce is filled with buffoons: