"The COTS industry is under an anti-commercial attack". Oh no, not the COTS industry!
> In my experience, the most technologically savvy F100's are all VERY familiar with utilizing in-house dev, and as recent releases from AWS/Azure might suggest (SQL Server on linux, Aurora support for postgres/mysql, etc) OSS is a key part of this as well.
Ding ding ding, fries are done! Oracle is deeply concerned with losing customers to PaaS and SaaS providers. Part of Oracle's current model is to push its hybrid and private cloud offerings to its customers; AWS and Azure already have a huge share of that market.
It's Oracle, though, and they have a huge presence in the gov't sector... They're trying to reclaim that as much as possible by moving in on the Trump Administration's transparent giveaways to favored parties.
1. I work in a backwards enterprise shop where "no one ever got fired for choosing Microsoft" and it would be easier (read as "possible") to get IT here to support it than Postgres. (That said, they do support Mongo internally but won't admit that to software teams.)
2. It's much cheaper than Oracle, which is the other "enterprise" database for which it's easy to get an install done.
Oracle is a fine enough database (it really is), but it costs an exceptional amount of money, and (I haven't touched it in a long time, so I may be out of date on this, but I don't think I am) the developer tooling experience is not great: Oracle SQL Developer is ugly and slow compared to SQL Server Management Studio. Basically, if you've used both Oracle and SQL Server, you can tell which company had its CEO on stage shouting "developers developers developers" and which is run by the lawnmower.
There is another argument for or against adoption of Oracle, and that is a basic business argument: Oracle is expensive, and their virtualization costs are well above Microsoft's.
Not the OP, but licensing comes to mind. MSSQL licensing and Windows Server licensing are a pain, especially in virtualized environments. But Oracle? The word predatory - complete with claws and fangs - comes to mind.
The thing I find crazy is that in 2017, $20M movies with B-list names are getting picked up as "independent". Ridley Scott's name shows up on IFC movies now; It Follows had an excellent ROI compared to most "blockbusters"; Get Out was a sold-out hit with audiences and critics... but it's all just a blip compared to the latest Transformers or DC or Marvel franchise shit-show.
A few millennia from now, archaeologists and historians will be pontificating about Hollywood the way we do about the Pyramids.
I heard some scuttlebutt about two years ago, after Oracle got rid of some Java people, that Oracle had wanted to leverage Java-as-a-Service as a cloud product, sort of like Google Compute Engine, but just never managed to get there... The Sun acquisition, someone told me, "was a hardware acquisition". With SPARC/Solaris dead, the Sun staff gone, and the lawsuits against Google over Java faltering, Oracle has sucked the last marrow out of the bones and is just casting them off now.
When I read about the "hardware integration" of OSX/macOS/whatever, I really have to wonder.
32-bit PowerPC is dead, and Apple never ventured into other, more exotic architectures. macOS (and OS X before it, since the mid-2000s) runs on x86 and is stumbling towards ARM; iOS runs on ARM. It's still commodity hardware, just like Windows and Linux.
I understand that Apple supports a much smaller set of hardware and drivers, but in my mind it's not really 'integration' in the way that, say, dinosaurs like SPARC + Solaris or POWER virtualization under AIX integrate hardware with software. I'm really just being cranky here and splitting hairs.
This isn't HW integration at the CPU level, but at the level of the whole damned device chipset. Apple knows what network, WiFi, video, audio, camera, USB, FireWire, or whatever other device(s) are going to be on the system. They can buy out entire production runs of hardware if need be. And if there are changes to the spec, they can stay on top of that as well.
Linux ... takes what you throw at it. It does this pretty damned well most of the time, but I've lived through and seen the challenges.
Bourgeois law in its wisdom protects individual property owners from their mistakes, unless they affect owners of larger property... that said, Equifax may have just fucked over the entire retail/consumer credit industry in the US, so the hammer may very well fall on them. We might get some of those banker perp-walks everyone loves.
But with the current administration as well as Congress (we're likely to see a Federal gov't shutdown over the budget even though the Republicans control the legislature and White House), I wouldn't anticipate seeing any regulation down the pike because of this.
They mention the Struts vuln, but not which one... Did an attacker access the info directly via a naive attack, or was this a sustained campaign? Having worked on Enterprise-Ready(tm) systems, I wouldn't be surprised if Equifax had an unsegmented network...