
Adding to this: too many negatives before making a point, which AI text is prone to doing in order to give surface-level emphasis to random points in an argument. For example: "I sat there for a second. It didn't lose the thread. It didn't panic. It prioritized like a real engineer would." Then there is the fact that the paragraph ends in just about the same way, which also activates one's AI-voice-detector, so to speak: "This wasn't autocomplete. This was collaboration."

In my opinion, to write is to think. And to write is also to express oneself, not only to create a "communication object," let's put it that way. I would rather read an imperfect human voice than a machine's attempt to fix it. I think it's worth facing the frustration that comes with writing, because the end goal of refining your own argument and your delivery is that much sweeter. Let your human voice shine through.


I could have written this comment myself. I was surprised to find out that it wasn't common to track every single thing you spend, to have multiple Excel sheets for your money, and to constantly look at how you are faring against your budget. Seems like a lot of folks look at their checking account and go "guess I still have some money left over to spend." Maybe it's because I grew up poor or because my mother was very careful with finances herself (in spite of hardship). I would be curious to know what your case is.


I also had this problem and one thing I found useful was to create a distinction between reflecting on something and referencing it. Writing things down helps me think, but it doesn't mean a note will be worth reading. What I began doing is creating a short summary for every note. That way, I only ever reference the summary. Reflections are for thinking, summaries are for referencing.


> The way Photoshop serves its market is quite specific and just casually loading up Photopea now I can see many parts of the app that would trivially frustrate a production workflow. I'm happy to list these for anyone curious.

This did make me curious. What problems did you notice? I know next to nothing about image editing software. However, I always like to understand how someone judges the usefulness of a given thing, assuming he or she has enough background knowledge / experience on a subject.


A non-expert Photoshop workflow is editing images in the CMYK colour mode and handling specials such as Pantone colours or custom ink mixes.

Photopea has a CMYK mode, which is great, but despite the document being in CMYK mode the adjacent swatches are RGB. A useful approach would be to convert RGB swatches to the destination colour space automatically, since that is how they will look when used.
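
For anyone curious what that conversion actually involves, here is a minimal Python sketch of the naive, profile-free RGB-to-CMYK formula. To be clear, this is my own illustration, not Photopea's or Photoshop's code: real tools convert through ICC profiles, and rgb_to_cmyk is a made-up helper name.

    # Naive, profile-free RGB -> CMYK conversion (hypothetical helper;
    # production tools go through ICC profiles and model dot gain).
    def rgb_to_cmyk(r, g, b):
        """Map 8-bit RGB to 0-1 CMYK ink fractions."""
        if (r, g, b) == (0, 0, 0):
            return 0.0, 0.0, 0.0, 1.0  # pure black: all key, no CMY
        r, g, b = r / 255, g / 255, b / 255
        k = 1 - max(r, g, b)  # key (black) taken from the lightest channel
        c = (1 - r - k) / (1 - k)
        m = (1 - g - k) / (1 - k)
        y = (1 - b - k) / (1 - k)
        return c, m, y, k

    # Even a vivid RGB red maps to quite different ink numbers:
    print(rgb_to_cmyk(228, 26, 28))  # roughly (0.00, 0.89, 0.88, 0.11)

Showing the user raw RGB swatch numbers while the document composites in CMYK hides exactly this mapping, which is why converting the swatch display makes sense.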

I also noticed that layer blending modes are computed in the RGB colour space rather than CMYK, despite the document's colour mode setting. This produces different results, especially with modes like Screen and Multiply, which behave quite differently in CMYK than in RGB. Someone who opens the Photopea PSD in Photoshop would therefore see a result the Photopea user never intended.
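
To make the ordering problem concrete, here is a toy comparison reusing the naive rgb_to_cmyk sketch above. I'm not claiming these are the exact per-channel formulas either app uses, only that blending before conversion and blending after conversion disagree:

    # Two source colours, combined with Multiply in two different orders.
    a, b = (200, 100, 50), (50, 150, 250)

    # Path 1: multiply in RGB first, then convert the result to CMYK.
    blended_rgb = tuple(x * y / 255 for x, y in zip(a, b))
    path1 = rgb_to_cmyk(*blended_rgb)

    # Path 2: convert each colour to CMYK, then darken per ink channel
    # (a plausible stand-in: coverages combine like added ink).
    ca, cb = rgb_to_cmyk(*a), rgb_to_cmyk(*b)
    path2 = tuple(x + y - x * y for x, y in zip(ca, cb))

    print(path1)  # ~ (0.33, 0.00, 0.17, 0.77)
    print(path2)  # ~ (0.80, 0.70, 0.75, 0.23) -- visibly different plates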

Next, I noticed that the channel management for the CMYK plates and specials provides no way to set up or manage a specials plate. For example, when using a Pantone colour you would need to create an additional plate and enter the Lab value for the ink; there didn't seem to be any way to do that, although Photopea did support what was already in the sample PSD file I loaded into it.
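
For anyone picturing what a specials plate holds: it's essentially an extra named channel carrying the ink's Lab definition plus per-pixel coverage. Below is a purely hypothetical sketch of that shape; this is not the actual PSD format, and the numbers are illustrative, not a real ink definition.

    from dataclasses import dataclass

    # Purely hypothetical record for a spot plate; NOT the PSD format.
    @dataclass
    class SpotPlate:
        name: str                        # e.g. "PANTONE 485 C"
        lab: tuple[float, float, float]  # ink colour as L*, a*, b*
        solidity: float                  # 0-1, how opaque previews render
        coverage: bytes                  # per-pixel ink amount for the plate

    # Illustrative values only, not the real ink definition:
    red_spot = SpotPlate("PANTONE 485 C", (48.0, 68.0, 48.0), 1.0, b"")

The gap is simply that Photopea preserves records like this when they already exist in a file, but offers no way to create or edit one.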

I didn't delve into colour profiles or control over dot gain, but those are other aspects usually absent in Photoshop alternatives.

Also, to reiterate: the above is not expert-level. Anyone in the industry would be familiar with these concepts and would need to manage them for production output. This is what I mean when I tell people "you don't need Photoshop, get something better": non-industry people don't need to fuss over whether the rendering intent should be perceptual or absolute (or the seldom-used middle grounds), what the output LPI is, what the device pixel requirements are, what byte order the RIP supports, what their maximum ink weights are, or how to adjust screen angles to deter plate mottling.


Hi, you make all excellent points. However they do sound more like comments from 2002, not 2024. With the complete takeover of the web, digital media, and digital photography, just where do you imagine all this CMYK stuff is still happening? Save for the last vestiges of print, it isn't. Most work today goes from RGB, to RGB, and then to RGB. Maybe throw in sRGB as well. So CMYK: nice to have, but not needed for most pro work today. FWIW, 40 years pro work at top NYC ad agencies and pub houses.


> Hi, you make all excellent points. However they do sound more like comments from 2002, not 2024. [...] FWIW, 40 years pro work at top NYC ad agencies and pub houses.

Hi LanceNY from the top NYC ad agency, wink wink. I can see why you made a fresh account just to reply to this: it's quite clear you don't have the experience you claim, nor even an understanding of basic digital-only principles such as the difference between a colour mode and a colour space.

It also seems you're suggesting that print and manufacturing don't occur in 2024. Classic.

I dare say your experience in this industry is no longer than the time it took you to write that comment.


Thank you for taking the time to write this out! It's so interesting to see exactly where things become tricky once you go from hobbyist to professional -- or from regular hobbyist to fancy hobbyist, for that matter.


Thank you for bringing that up; understaffing affects everything and harms patients. No set of alarms will ever replace the benefit of having enough people working.


Knowing how to trend a patient's health is probably more useful than relying on all the alarms. Patients rarely deteriorate from one second to the next if you know what to expect from their baseline. At least, that's how I worked as a nurse. I never worked somewhere like the ICU, though, so the approach might be different there.


Well, with AI, people will still likely be suffering, except those at the top of the pyramid will be doing even better. Not so great. The AI debate is not about technology per se; it's about seeing how gains in technology and productivity go only to the top, and extrapolating how bad that will get when greed meets the hyper-productivity of AI.


Really late to the party on this, but I don't see the purpose of remembering every day without fail. For my part, I have been journaling for a while now, and every day I make a note of something that happened. However, I don't get so caught up in particular moments.

If you have enough data, so to speak, you notice that most of what you do, feel, or think is somewhat circular. You come back to certain emotions; you come back to certain perceptions. Remembering by itself, in the absence of reflection, makes little sense. Plus, whenever I do glean something from my previous writings, it is usually from understanding the general "feel" of a time in my life, not from memorizing a particular day.

>The moments captured in my images are fresh, but my perspective on them changes.

I like that the author acknowledges that here. Again, though, I do think it matters more to interpret one's experience than to have a perfect recollection of the facts. And, to be fair, what he does, and what I do, sound more like ways to handle far more existential concerns. I suppose we want to hold on to life, to give it meaning in some way by always being a witness to what has been. Maybe we would all be better served by acknowledging the underlying philosophical conundrum here, haha.


I seriously don't get this argument, which I see far too often: that writing is simply a means to the "end" of a product. I like a good piece of fiction precisely because it was written by a human being. It is that human's way of communicating something to me and the rest of the audience. Do people out there really want to read something written by a machine? Of course, bottom-of-the-barrel, corporate-approved fiction may as well be written by AI. But that isn't what I seek out, and I would wager there are many others who feel the same way. What is the appeal of AI-generated fiction?


I don't usually comment, but I just wanted to say that I absolutely love Obsidian. I have been using it every day for about two years now, and my life would not be the same without it. It's amazing. I also gladly pay for Sync; it's more than worth it.

