
I'm in the same country. Another big reason to use a CB over a phone is that it's illegal to touch a phone while driving. No such restriction applies to CB radios.

The tech is very much alive and well.


They're hard for me because the events that a lot of people consider achievements don't really stand out in my memory. Often I tend to forget they happened.

I've solved some programming problems that I considered quite mundane and unremarkable, yet others thought they were great achievements.

While it might have been hard at the time, in hindsight the events seem unremarkable and just me doing my routine duties.

> "Write about a time during your university studies in which you faced a difficult problem, and what you did to overcome it."

I guess for the university example I could spin a story about how I failed a subject, had to repeat it, and got high marks the second time round. The thing is I probably won't remember the event if I'm in an interview and under pressure.

When I started writing this post, I couldn't think of anything difficult that I had to overcome in my CompSci degree. It took me a while to even remember failing that subject, and in hindsight I don't have any emotional attachment to the event. It just doesn't stand out in my memory as remarkable or interesting or even difficult. I did change up my tactics the second time around and did quite well in the subject, so I have material for a story.

The problem is most of the time I don't even remember failing that subject. Even if I did remember, I'd probably dismiss it as I don't remember it being difficult.


I am finding this a problem. If you are a senior developer you are leading every day and doing senior things, but it is like walking: I don't remember each step I took.

In the performance review you now need to say "On this Tuesday I needed to get from the salon to the baker, so I initiated my motor neurons and walked out of the salon. This got me there in 5 minutes, which had the impact of my mum getting her cinnamon scroll," and you have to remember that happened. For those with worse memory this is an extra job. If you don't do it you get discriminated against.


In a job interview:

"Tell me about a time when you tripped over while commuting."

"Tell me about a time when your feet touched each other during a walk."

"Tell me about a time when you were facing north-east and a bus passed in front of you." [follow-up question] "What type of bus was it? [suburban, long distance, etc.] You say you saw it, so walk me through your visual experience."

If you have lived your life as a walking person, as you seem to imply by your comment, you surely have done these things multiple times, right? Failing to respond in a truthful and satisfactory manner will be counted heavily against you.


> will be counted heavily against you.

Yes, that's the problem

Though, as someone who's done a number of those interviews over the years, I'd replace the word "truthful" with "in a manner that the interviewer regards as truthful".


So I'm on the other end of the number line from the SDAM folks and I'm kind of mind blown that people don't remember when they trip, I remember at least 6 instances off the top of my head. Ditto feet touching each other - which shoes I was wearing, what the weather was like, where I was at. The bus question I would have to dig a little but I'm sure it happened at least once.


That's wild. I can remember categories of tripping. For example, I know that one of the more frequent ones involves stupid cats who don't realize that it's a bad idea to walk in front of a rapidly moving creature who outweighs them by a factor of 13. But I can't recall any specific instance of it.


I remember tripping as a child (ice cream truck...), in middle school (stairs), in high school (book bag), while getting coffee over a decade ago with coworkers (sciatica). I couldn't necessarily tell you which dates those happened on but could probably get it within a month or so.


I don't trip over that often; I remember maybe one time in the past two years that I tripped. Maybe people just don't trip that much.


I mean I remember tripping when I was 5, 15, and 25, and I'm 42 now so I don't trip that often, I just remember everything.


I mean, I've definitely had all these experiences and know I have, but I couldn't tell you a single detail about any of those moments. I've missed plenty of buses because I've had music blaring and I wasn't focused on the bus stop, but I definitely don't remember anything else about those moments other than that they, at some point in history, happened to me.


I feel like, fuck it, I'm going to be that frank in the interview. Or all my examples will be from the last 21 days. Do a lot of heroic stuff at work for 21 days to coincide with the interview!


There is something I like about win32 gui programming. It's a little idiosyncratic, but if you read Raymond Chen's blog you'll see why.

The win32 API has its origins on the 8088 processor and doing things a certain way results in saving 40 bytes of code or uses one less register or something.

I wrote a lot of toy gui apps using mingw and Petzold's book back in the day. Writing custom controls, drawing graphics and text, handling scrolling, hit testing etc was all a lot of fun.

I see in your app you're using strcpy and sprintf. For any kind of serious programming you should be using the length-checked variants. I'm surprised the compiler didn't spew warnings.

You'll also find that the Win32 API has a lot of replacements for what's in the C standard library. If you really want to try and get the executable size down, see if you can write your app using only <Windows.h> and no cstdlib. Instead of memset() you've got ZeroMemory(), instead of memcpy() you've got CopyMemory().

At some point writing raw C code becomes painful. Still, I think doing your first few attempts in raw C is the best way to learn. Managing all the minutiae gives you a great sense of what's going on while you're learning.

If you want to play more with win32 gui programming, I'd have a look at the WTL (Windows Template Library). It's a C++ wrapper around the win32 API and makes it much easier to reason about what's going on.


> There is something I like about win32 gui programming

Totally agree with you. I use an excellent PC app called AlomWare Toolbox, and it's the epitome of Win32 design (https://www.alomware.com/images/tab-automation.png); despite it doing so much, it's only about 3 MB in size. No frameworks either, just a single executable file. I wish all software were still like this.


Is the font-size adjustable? It's too small on my screen


Sorry, just saw this. Yes, there's a setting in the app to use a larger font.


At minimum, these days, if you don't use strncpy instead of strcpy, you'll have to suffer through every man and his dog (or AI tool) forever telling you to do otherwise. (For me this is one of the main arguments for using Zig; a lot of these common pitfalls are minimized by it, but C is fine as well.)


Heh, and if you use strncpy() you'll have to suffer through me lecturing you on why strncpy() is the wrong function to use as well.


strncpy is more or less perfect in my line of work, where a lot of binary protocols have fixed-size string fields (char x[32]) etc.

The padding is needed to make packets hashable and not leak uninitialized bytes.

You just never assume a string is null terminated when reading, using strnlen or strncpy when reading as well.


Yep, that's the intended use case for strncpy().

It's not really suitable for general purpose programming like the OP is doing. It won't null terminate the string if the buffer is filled, which will cause you all sorts of problems. If the buffer is not filled, it will write extra null bytes to fill the buffer (not a problem, but unnecessary).

On FreeBSD you have strlcpy(); Windows has strcpy_s(), which will do what the OP needs. I remember someone trying to import strlcpy() into Linux, but Ulrich Drepper had a fit and said no.

> You just never assume a string is null terminated when reading, using strnlen or strncpy when reading as well.

Not really possible when dealing with operating system level APIs that expect and require null-terminated strings. It's safer and less error-prone to keep everything null terminated at all times.

Or just write in C++ and use std::string, or literally any other language. C is terrible when it comes to text strings.


> On freebsd you have strlcpy()

strlcpy() came from OpenBSD and was later ported to FreeBSD, Solaris, etc.


Yup.

Lots of good security & safety innovations came from OpenBSD.


You shouldn't use any of those garbage functions. Just ignore \0 entirely, manage your lengths, and use memcpy.


I am not writing in C, but always wondered, why pascal-like strings wrappers are not popular, i. e. when you have first 2 bytes represent the length of the string following by \0 terminated string for compatibility.


2 bytes is not enough; usually you'll see a whole size_t's worth of bytes for the length.

But you could do something utf-8 inspired I suppose where some bit pattern in the first byte of the length tells you how many bytes are actually used for the length.


Pascal originally required you to specify the length of the string before you did anything with it.

This is a totally good idea, but was considered to be too much of a pain to use at the time.


In C you have to do that too, like... malloc()?


You still need a 0-terminated string to pass to API of most libraries (including ones included with the OS - in this case, Win32).


Yeah, Drepper said the same thing.


> It won't null terminate the string if the buffer is filled, which will cause you all sorts of problems.

if you don't know how to solve/avoid a problem like that, you will have all sorts of other problems

Pound-define strncpy to a compile fail, write the function you want instead, correct all the compile errors, and then not only move on with your life, but never speak of it again, for that is a waste of time. C++ std::string is trash, Java strings are trash; duplicate what you want from those in your C string library and sail ahead. No language has better-defined behaviors than C; that's why so many other languages, interpreters, etc. have been implemented in C.


I thought a string is just a byte array that has null as its last element?

How can a string not be null-terminated?


Whether the string ends in a null byte or not is up to you as the programmer. It's only an array of bytes, even though the convention is to null-terminate it.

Well, maybe more than just a convention, but there is nothing preventing you from setting the last byte to whatever you want.


Everything in C is just an array of bytes; some would argue a uint32_t is just an array of 4 bytes. That's why we need conventions.

A string is defined as a byte array with a null at the end. Remove the null and it's not a string anymore.


> Everything in C is just array of bytes, some would argue uint32_t is just array of 4 bytes

That isn't how the C language is defined. The alignment rules may differ between those two types. Consider also the need for the union trick to portably implement type-punning in C. Also, the C standard permits CHAR_BIT to equal 32, so in C's understanding of a 'byte', the uint32_t type might in principle correspond to just a single byte on some exotic (but C-compliant) platform.

No doubt there are other subtleties besides.


That's only one possible convention, and it's not a particularly good one at that.


> Instead of memset() you've got ZeroMemory(), instead of memcpy() you've got CopyMemory().

I believe MSVC intrinsics will use the rep stos/movs instructions, which are even smaller than calling functions (which includes the size of their import table entries too.)


The standard allows memset/memcpy to be replaced by inline code. There is no need to use non-standard extensions to get a performance boost.


That's how the MSVC intrinsics work. Turn on the option and memset/memcpy, among others, get replaced automatically:

https://learn.microsoft.com/en-us/cpp/preprocessor/intrinsic...


I spent a lot of time doing that and to be honest, I miss the ability to develop for native UIs with native code.


> You'll also find that the Win32 API has a lot of replacements for what's in the C standard library. If you really want to try and get the executable size down, see if you can write your app using only <Windows.h> and no cstdlib. Instead of memset() you've got ZeroMemory(), instead of memcpy() you've got CopyMemory().

I see he's also using fopen/fread/fclose rather than CreateFile/ReadFile/WriteFile/etc.


> I see he's also using fopen/fread/fclose rather than CreateFile/ReadFile/WriteFile/etc.

It's a todo list, not a network service. So what if it's using unbounded strcpy's all over the place? It has basically no attack surface. He wrote it for himself, not for criticism from the HN hoi polloi.

For once maybe take someone's work at face value instead of critiquing every mundane detail in order to feel like the smartest person in the room.

Computers are tools to get stuff done. Sometimes those tools are not pretty.

I place much of the criticism being levied here in the same category as the "we must rewrite 'ls' in Rust for security" nonsense that is regularly praised here.


> So what if it's using unbounded strcpy's all over the place? It has basically no attack surface. He wrote it for himself, not for criticism from the HN hoi polloi.

I didn't point that out so I could be the smartest person in the room and I certainly don't subscribe to the whole rewrite-the-world in rust.

The sheer amount of time I spent debugging problems caused by buffer overruns and other daft problems is immense. It's literal days of my life that could have been saved had safer APIs been created in the first place.

It's a cool toy program and I encourage the learning but maybe let's try and avoid unnecessary problems.


>I certainly don't subscribe to the whole rewrite-the-world in rust.

Good because those Rust people get really upset when you point out that Rust mostly seems to exist for people to "Rewrite X in Rust".


To be fair, CreateFile etc are a lot more verbose than fopen.


Oh yes, all those parameters are absolutely a pain to work with. But it can still be good to have an understanding of what options are abstracted away by fopen etc. Trying to write an app only using <Windows.h> can be a good learning exercise if you want to understand the fundamentals of the OS.


I agree with most of this, but let's be honest: win32 gui programming (like this) is/was a pain.

Even MFC barely took the edge off. It's amazing how much better Borland's "Delphi-like" C++ library was.

> Instead of memset() you've got ZeroMemory(), instead of memcpy() you've got CopyMemory().

Yes. And your best API for opening (anything but files but maybe files as well) is... CreateFile

Aah the memories :)


> It's amazing how much better Borland built their "Delphi like" C++ library.

As I recall, it wasn't "Delphi like", but rather literally the same VCL that Delphi used. That's why C++Builder had all those language extensions - they mapped 1:1 to the corresponding Delphi language features so that you could take any random Delphi unit (like VCL) and just use it from C++. In fact, C++Builder could even compile Delphi source code.


Yes, and it grew out of the Object Windows Library, which also added extensions, and was definitely much more pleasant to use than MFC ever managed to be.

No need for the past tense, both products are still on the market with frequent releases and developer conferences, even if no longer at the same adoption level.


I remember OWL being somewhat weird in that it rendered quite a few stock controls itself, in ways that made OWL apps really stick out on Win 3.11.

MFC gets a lot of flak, but I think that a large chunk of it is undeserved because it's a fundamentally different kind of framework - a wrapper that tries to streamline the use of underlying APIs without concealing their fundamental nature, whereas OWL and VCL (and VB6, and WinForms) are higher-level wrappers that do quite a lot for you even when they use native widgets under the hood. From that perspective, if anything, the more appropriate criticism of MFC is that it tries to do too much - e.g. that whole document/view thing is clearly a high-level abstraction that always felt out of place to me given the overall design of the framework. WTL is basically what MFC tried to be but failed.


That was optional, it just turned out too many people liked to enable custom rendering.

That is exactly the reason why many of us dislike MFC, its low level wrapping of Windows APIs.

With OWL you could already have a kind of C#-like experience, but in Windows 3.x; that is how far ahead the experience was versus MFC.

This is the tragedy of C++ frameworks: the tooling could be as good as VB, Delphi, Java, or .NET, but then we have a big chunk of developers that insist on pushing for low-level Cisms instead.

Honestly, I never saw much uptake of WTL, especially because dealing with ATL was already bad enough.

From the outside it feels like the chain of command at Microsoft has some big issue with producing great GUI development experiences for C++ developers.

When they finally nailed it, having a C++ Builder-like experience, it was killed in the name of extensions by a rebel group that nowadays is having fun writing Rust bindings for Windows APIs.


> Instead of memset() you've got ZeroMemory(), instead of memcpy() you've got CopyMemory().

What is or was the purpose of providing these instead of the existing C standard library functions?


It's worth remembering that Windows 1.x and 2.x predate the C89 standard. This also explains why the WINAPI calling convention was inherited from Pascal instead of C. The C standard library was "just another competitor" at the time.


The WINAPI calling convention is a cross between C and Pascal - C-style order of arguments on the stack, but Pascal-style callee cleaning the stack before return.

The reason for its use in Windows is that it makes generated code slightly smaller and more efficient, at the cost of not supporting varargs easily (which you don't need for most functions anyway). Back when you had 640 KB of RAM, saving a few bytes here and there added up quickly.


Those functions explicitly? I can't find any definitive explanation on why they exist.

It looks like nowadays ZeroMemory() and RtlZeroMemory() are just macros for memset().

Here's an article on some of the RECT helper functions. Relevant for the 8088 CPU but probably not so much today: https://devblogs.microsoft.com/oldnewthing/20200224-00/?p=10...


Windows didn't standardize on C. It was mostly assembly and some Pascal in the beginning with C and C++ later.

Microsoft have always viewed C as just another language; it's not privileged in the way UNIX privileges C. By implication, the C standard library was provided by your compiler and shipped with your app as a dependency on Windows; it wasn't provided by the operating system.

These days that's been changing, partly because lots of installers dumped the MSVC runtime into c:\windows\system and so whether it was a part of the OS or not became blurred and partly because Microsoft got more willing to privilege languages at the OS level. Even so, the Windows group retains a commitment to language independence that other operating systems just don't have. WinRT comes with lots of metadata for binding it into other languages, for example.


> Windows didn't standardize on C. It was mostly assembly and some Pascal in the beginning with C and C++ later.

No, it was never Pascal. It was always C from the beginning. You may have been confused by them using the Pascal calling convention because it was generally faster on the 16-bit CPUs of the time.


Apple was the one going with Pascal for the OS, originally the Object Pascal linage was started at Apple, in collaboration with Niklaus Wirth that gave feedback on the design.


You could write code without using libc / the C runtime. You still can.


Unlike Unix, Windows historically didn't have a standard C runtime at all. Stuff like MSVCRT.DLL etc came later (and are themselves implemented on top of Win32 API, not directly on top of syscalls as is typical in Unix land).


I second this, and just want to add that strsafe.h contains replacements for the runtime string routines.


Back end coder for about 18 years.

I don't have one, and no employer in an interview has ever really expressed interest in something like that.


In the interview that got me my current job, the manager brought in two developers I would be working with. One of them had found my GitHub repos where I had some small personal projects, so he was able to see some evidence that I knew what I was doing.

So that was useful, but I don't think it would have helped to have had anything more portfolio-ish than that.


So, Autocomplete, Lang Server and Copilot.

I don't use any of those, and I've never really felt the need for them. Are Make and Vim not an IDE?

None of those existed when I started to code at my first dayjob and I've never really seen the value in them.

I find it interesting you think that go-to-definition isn't possible without a language server. I was doing that long before lang servers existed.

When I want to look a function signature up, well there's two ways I do it.

1) vim + ctags and Ctrl-] will take you there (usually, sometimes it gets confused).

2) grep (or nowadays, ripgrep) the codebase for the name of the function (in another window).

As for remembering what is where: good organization helps. Over time I tend to develop a mental model of what is where, and I'll just find myself popping over to the other terminal and opening a file in vim to find what I need.

You can take syntax highlighting from my cold dead hands, though.

