Hacker News | alwayshumans's comments

The vast majority of navigational incidents are caused by human error, but better systems would give the crew a chance to realise their mistakes.

At the moment many maritime crews have alarm fatigue: the systems constantly warn of danger, so crews learn to ignore alarms, and when something genuinely serious does happen, that alarm gets ignored too.


So one of the fundamental design elements of an electronic chart display and information system (ECDIS), or its warship variant (WECDIS), is the use of the safety depth contour.

Anything inside the blue area would trigger an alarm on a traditional ECDIS.
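
As a toy sketch (not real ECDIS logic; the contour value and depth samples below are made up), the rule amounts to "alarm if any charted depth along the planned track is shallower than the contour the navigator selected":

```python
# Toy sketch of the safety-contour idea, not real ECDIS behaviour.
# A charted depth shallower than the chosen safety contour means the
# track enters "the blue area" and should raise an alarm.

SAFETY_CONTOUR_M = 10.0  # contour chosen by the navigator (hypothetical value)

def crosses_safety_contour(track_depths_m, safety_contour_m=SAFETY_CONTOUR_M):
    """Return True if any charted depth along the track is shallower than the contour."""
    return any(depth < safety_contour_m for depth in track_depths_m)

# Charted depths (metres) sampled along a planned route:
route = [42.0, 35.5, 18.0, 9.5, 22.0]
if crosses_safety_contour(route):
    print("ALARM: planned track crosses the safety depth contour")
```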

The bigger issue is the data that is used to navigate on; I could write a very boring blog detailing why the systems work so badly...


you should! let's geek out on it


would love to chat with you about it. i think we should solve this problem.


I wonder if it's just a random manager who approved the donation.

For those who don't have experience of big orgs: 5k is often in the region where you don't need to go higher for approval or involve procurement.


From my experience business travel is as much about sharing an experience as it is the discussion or dissemination of information. That's a hard thing to replicate


Until technology can replicate the experience of staying out late, drunkenly chatting with coworkers in the back of a taxicab as you ride through Manhattan, then grabbing a midnight snack at a hole-in-the-wall diner someone recommended back at the office, business travel will have its place.


My theory is that physical proximity means danger or love to our animal brains... and knowing it's tech will disengage your brain from feeling close and will change your emotions and engagement with the other person. Now, something more natural than an LCD screen might make video calls a bit more lively and efficient.


The real value of business travel is what you do with people outside of work hours.


and business kickbacks and shared drinks at the hotel bar


This just feels like the latest in a long line of industries trying to build a recurring revenue model fueled by greed.

Looking at the uproar over moves by Cricut[1] and others makes me question how out of touch some of these companies are.

I think companies need to realise that a subscription needs to deliver constant, tangible value to the consumer.

[1] https://connpirg.org/blogs/blog/usp/angry-crafting-moms-blow...


This would not have been a problem (or even an option for the company) if the software was open source.


What3words is a clever bit of research that should never have been spun into a company.

It's just an implementation of a discrete global grid.

I don't think it's live anymore, but there used to be a parody called What3fucks; you can probably guess what type of words it used.
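
A toy version of that grid idea, with a deliberately tiny made-up vocabulary (real systems use tens of thousands of words so that three of them can uniquely address every few-metre cell; four words here means constant collisions, but it shows the scheme):

```python
# Toy discrete global grid in the spirit of what3words: carve the globe
# into fixed-size cells, number them, and spell the cell number in words.
# The word list and cell size are hypothetical.

WORDS = ["apple", "boat", "cliff", "dune"]  # far too small for real use

CELL_DEG = 0.0001  # roughly 11 m cells at the equator, for illustration

def cell_index(lat, lon):
    """Map a lat/lon to a single integer cell index on a fixed grid."""
    cols = int(360 / CELL_DEG)
    row = int((lat + 90) / CELL_DEG)
    col = int((lon + 180) / CELL_DEG)
    return row * cols + col

def words_for(lat, lon, n=3):
    """Encode the cell index in base len(WORDS) as n dot-separated words."""
    idx = cell_index(lat, lon)
    out = []
    for _ in range(n):
        out.append(WORDS[idx % len(WORDS)])
        idx //= len(WORDS)
    return ".".join(out)
```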


You can already host vector tiles within S3 that work with Mapbox and other libraries.

Not really sure what problem this is trying to solve.


The typical way to build and distribute Mapbox vector tiles has been by packing them into a sqlite database with individual rows for each Z/X/Y quad tree coordinate. This is what tools like tippecanoe typically produce.

The problem with this is it still requires a running server process colocated with that sqlite db in order to service requests for individual tiles, like what you'll receive from a client-side mapping library.

This project has a lot of different parts, but one of them is a spec for serving this type of data at low latency without a server by combining a custom packing format with S3 range-get requests to read individual tiles direct from blob storage. So that is certainly interesting for this kind of use-case.
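
A minimal sketch of that serverless pattern (the packing format below is invented for illustration; the real spec defines its own header and index layout):

```python
# Pack tiles into one blob with a lookup index, then read a single tile
# with a byte-range read -- a stand-in for an HTTP Range request to S3.

def pack(tiles):
    """tiles: dict mapping (z, x, y) -> tile bytes.
    Returns a lookup index plus one concatenated blob for upload."""
    index, blob, offset = {}, b"", 0
    for key, data in tiles.items():
        index[key] = (offset, len(data))
        blob += data
        offset += len(data)
    return index, blob

def range_get(blob, start, length):
    """Stand-in for an HTTP Range request against blob storage such as S3."""
    return blob[start:start + length]

index, blob = pack({(0, 0, 0): b"root-tile", (1, 0, 1): b"child-tile"})
start, length = index[(1, 0, 1)]
assert range_get(blob, start, length) == b"child-tile"
```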


In a typical example, isn't this what Route 53 solves?


OP author here; one thing I didn't touch on is that the way most map services license their basemap data explicitly disallows "caching", which would include hosting basemaps yourself on your own storage. So for the cases where it makes engineering or product sense to have your maps on S3, it goes against the dominant map business model.


Is this a simple process? Can you share a link to how this is done, I'm interested. I always assumed it was fairly complicated.


I’ve used the utility tippecanoe with success to generate vector tiles from GeoJSON sources and am hosting them statically. It’s been overall pretty easy. The only gotcha I can remember is that I needed to pass the no-compression flag, since Mapbox GL can’t read compressed tiles (and the file host will gzip everything anyway).


What issues have you run into with gzipped tiles? I store a lot of gzipped individual tiles for use in Mapbox GL maps on S3 with no problem - the only necessary change was making sure that S3 was emitting `content-encoding` headers.
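
For what it's worth, the fix looks roughly like this (a sketch; the bucket/key names in the comment are hypothetical, and the upload call itself is only indicated):

```python
import gzip

def prepare_tile_upload(tile_bytes):
    """Compress a vector tile and return the body plus the settings the
    object store must serve it with so clients transparently decompress."""
    body = gzip.compress(tile_bytes)
    params = {
        "ContentType": "application/vnd.mapbox-vector-tile",
        "ContentEncoding": "gzip",  # without this, Mapbox GL sees raw gzip bytes
    }
    return body, params

# e.g. with boto3 (not executed here):
# body, params = prepare_tile_upload(tile_bytes)
# s3.put_object(Bucket="my-tiles", Key="tiles/3/4/5.pbf", Body=body, **params)
```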


I couldn’t get the tiles to show up at all when they were compressed by tippecanoe. I’m not sure what the problem is, to be honest.


At least for simple GeoJSON you can use tippecanoe (https://github.com/mapbox/tippecanoe) with --output-to-directory to get individual .pbf tiles. Most rendering clients will have some scheme where you can provide a root URL, and then the tiles need to be stored in some defined structure beneath that (e.g. root/z/x/y.pbf).

(posted something similar to another response)


One interesting thing is that the toll often fluctuates with the oil price.

If the toll is deemed to be too high a vessel always has the option of going around the Cape.

Realistically it's always an option but the added time plus the additional wear on a vessel makes it undesirable.


Starlink has the potential to completely revolutionise the shipping sector. Digitisation in the maritime space has been slow and adoption/innovation has been hampered by poor connectivity.

Cheap connectivity could be incredible. Current sat systems are expensive and coverage is patchy.

IoT and autonomy offer potential cost savings big enough to attract investment from some of the most frugal operators.


How so?

I mean, besides providing better connectivity for people on ships, what will/can change? What was blocked in building out by current satellite offerings?


On cruise lines at least, internet is extremely expensive and slow. Like $80/GB. Starlink would be awesome (assuming it will be safe to go on cruises in the future...).


You could live on cruise ships 11 months out of the year...


> Assuming it will be safe to go on cruises in the future...

Why wouldn’t it be?


New (covid) world order


Same COVID obsession that leads to people wearing masks during their wedding ceremony, etc.


There’s a surprising amount of real-time route optimization that in theory could improve efficiency and cost by a few percent by taking currents and weather into account.


Weather routing is already prevalent over existing satphone connections. The GFS model that is typically used for this purpose has low enough resolution that there are only marginal returns to additional bandwidth or reduced latency.


You can interpolate the GFS very accurately in terms of wind over land and close to land with WRF-ARW. There is highish resolution land use data and good elevation models. This is something you can do on your laptop even these days.

However, I have no idea how good it is at waves and currents.


You can also put the compute and the data on land, and do routing as a service. The decision bandwidth (i.e., ideal heading/route) is minuscule.


You can also get real time data via ham radio although it’s extremely slow. It still works for navigation adjustments and optimizations. I fail to see how this is relevant beyond crew morale.


Well, routing seems like something that needs just the ship's position. You can easily send that over, for example, the Iridium network, and as long as you don't try doing that 10 times per second (why would you?), if the savings were actually that big it could already have been done, even many years ago.
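
To put a rough number on how small a position report is (the field layout below is made up for illustration; an Iridium SBD message can carry a few hundred bytes):

```python
import struct

# lat, lon as signed 1e-7 degrees; heading in whole degrees; speed * 10
REPORT_FMT = ">iiHH"

def pack_report(lat, lon, heading_deg, speed_kn):
    """Pack a ship position report into a fixed 12-byte binary record."""
    return struct.pack(REPORT_FMT,
                       int(lat * 1e7), int(lon * 1e7),
                       int(heading_deg) % 360, int(speed_kn * 10))

report = pack_report(51.5074, -0.1278, 270, 14.5)
print(len(report), "bytes")  # prints: 12 bytes
```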


Most vessels will have some form of connectivity on board right now. It depends on their vicinity to coast.

Right now, email-sized data transfers are the industry standard. As you go into hundreds of MB, you start to see people really struggle.

I would look at BIM systems for a parallel with ships. Ships are incredibly complicated and often self reliant. Shore based monitoring is hated by the crew but vessel operators see return from it.

A simple example would be an extension of the IoT devices already on board, probably around engines. Anything with tangible benefits to efficiency and safety would see investment.


Tracking the status of your cargo. At the moment you always have uncertainty during the voyage - you don't know if e.g. temperature was held consistently until you took control of the cargo after arrival, or if it was tampered with during transit in any way.

For fleet owners, no matter if for ships, cars/trucks or planes, management becomes way easier if they have global high bandwidth data uplink: you can detect stuff like engine or other part wear early by running big data analysis on telemetry, and in disaster case you have way better logs to operate on (both post mortem and during troubleshooting).

This by the way also is relevant for car manufacturers - with the exception of Tesla, the only way classic car manufacturers get information on how their cars are used in the real world comes from test drivers, accident reports and shop visits. Consistent data uplinks allow for a lot more telemetry that can be used to improve their products.

Surprisingly, even scientists can benefit from this - have ships transmit real-time data about air pressure, sunlight, wave height etc. and suddenly you have a global network of floating data buoys.


> temperature was held consistently until you took control of the cargo after arrival, or if it was tampered with during transit in any way.

There are already temperature and tamper devices for sensitive freight. There are devices that will show if orientation has changed or if shock occurred etc.

Sensitive freight is an entire industry that has already innovated around the lack of connectivity.

That's not to say there isn't further innovation to be made.


> There are already temperature and tamper devices for sensitive freight. There are devices that will show if orientation has changed or if shock occurred etc.

Tiltguard stickers, or whatever they were called... about the most useless bullshit I've seen. Back when I worked in stage lighting, many cases had these affixed... and you can guess what every shift leader had in their pocket: yep, a load of replacement ones.


> Consistent data uplinks allow for a lot more telemetry that can be used to improve their products.

Shipping is one thing -- the company still owns the goods. But keep this out of my cars.


I’m curious, why?


Because that data can be seized by cops or mined by secret services.


Ships already have satellite connections, and GPS, and internet, and all of that. Starlink wouldn't revolutionize anything.

At best, it might be cheaper than what is currently on the market, but that expense is largely driven by durability requirements for equipment exposed to heavy amounts of moisture and inclement weather. Durable products are not something that any of Musk's companies are known for (especially Tesla), so they enter this market with the disadvantage of that reputation.


> Durable products are not something that any of Musk's companies are known for

Starlink is not Tesla, and from my study of the /r/starlink subreddit, there have not been any significant complaints about the dish reliability, even in the bitter snow and cold


That's precisely the argument I heard from Nokia die-hards when the iPhone was introduced :-)


This isn't a Nokia vs iPhone situation.

Shipping already does everything that Starlink proponents claim Starlink will enable, i.e., track ships, track cargo, track temperatures in containers. And this data isn't like video, so you don't need a lot of bandwidth for it.

Starlink might make things cheaper than current satellite communications offerings, but this is evolutionary not revolutionary.


In another thread, someone suggested Viasat was costing between $20k and $50k per month.

On an ocean-going cargo ship, that is not much relative to the other costs.


And my blackberry could already run apps, browse the web, and send quick messages to other blackberries years before the iphone.


To be fair, your iphone doesn't really do anything interesting past what your blackberry did. It's mostly network effects, which used to exist for blackberries and now exist for smartphones. What they're used for by most people is almost indistinguishable from what they used to be used for.


Not the same apps, though. In this comparison, it would be "my blackberry already runs iOS apps, but they're a bit slower than the new iphone".

Improvement for sure, and I'd bet on Starlink making a lot of money on this by volume, but it's not going to radically change shipping.


Funny, I've just gone back to a Nokia.


No wireless. Less space than a nomad. Lame.


The OS (Ordnance Survey) are making a lot more data freely available as part of the Geospatial Commission's push to make geospatial data more widely available and drive innovation. I wouldn't be surprised if this is all a slow response to external activities like OSM and Google.


I remember reading that, when all OS data cost money, half of their income came from other government departments; so once you add in the cost of dealing with all that licensing, there's not much extra cost to just fund it out of general revenue.


It’s also in response to the urban planning difficulty of knowing who owns what, or what’s under a given spot.

