it's short, phonetic, includes "words" (plural), is a homophone of "word scene", literally describes the collection of each word seen in public scenery, etc.
I don't know that that characterization is accurate.
At a minimum you needed to implement an IList whose indexer creates the actual data when called. You would then set that IList as the ItemsSource.
If you could not provide data synchronously, you could return an observable object and raise property-changed events when the data was available.
I don't know if there were constraints that made that unworkable for your scenario, but my memory is that data virtualization wasn't much more code than in WinForms.
Visual Studio introduced an interface that formalized fetching a page at a time, so you did less bookkeeping when mapping indices to items for SQL-like scenarios, but I don't know whether controls performed better using that interface.
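The pattern described above (a list whose indexer materializes data on demand, fetched a page at a time) can be sketched like this. This is a minimal illustration in Python rather than C#/WPF, and the `fetch_page` callback and class name are hypothetical, not any actual WPF API:

```python
class VirtualList:
    """List-like object whose indexer creates the actual data when called,
    fetching one page at a time so only displayed rows are ever paid for."""

    def __init__(self, count, fetch_page, page_size=100):
        self.count = count            # total row count, known up front
        self.fetch_page = fetch_page  # callback: page_index -> list of items
        self.page_size = page_size
        self.pages = {}               # cache of already-fetched pages

    def __len__(self):
        return self.count

    def __getitem__(self, index):
        if not 0 <= index < self.count:
            raise IndexError(index)
        page, offset = divmod(index, self.page_size)
        if page not in self.pages:
            # Hydrate the page on first touch, e.g. with a SQL query
            self.pages[page] = self.fetch_page(page)
        return self.pages[page][offset]
```

A control bound to such a list only triggers fetches for the pages its visible rows fall in, which is the bookkeeping the paging interface mentioned above was meant to formalize.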
This seems like it would work, but it actually didn't. One issue was that WPF uses rows with dynamic row heights instead of fixed row heights like GDI or WinForms. This forced it to calculate everything it needed for the layout of each row to compute the total height, scroll position, etc...
Like I said, this got changed when Visual Studio was rewritten to use WPF, and it "works now", but the default behaviour is still the slow path, not the fast path.
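The cost difference being described can be sketched in a few lines (an illustration, not WPF's actual layout code): with fixed row heights the scroll position of any row is constant-time arithmetic, while dynamic heights force the control to measure every row above the target, which defeats virtualization.

```python
def scroll_offset_fixed(index, row_height):
    # Fixed row heights (GDI/WinForms style): O(1) arithmetic,
    # and total extent is just count * row_height.
    return index * row_height

def scroll_offset_dynamic(index, measure_row):
    # Dynamic row heights (default WPF style): every row above the
    # target must be measured just to know where the target starts,
    # and every row must be measured to size the scrollbar.
    return sum(measure_row(i) for i in range(index))
```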
It famously wasn't intended for authentication, which is why OIDC was developed on top of it. Trying to run a delegated authorization protocol as an authentication protocol caused vulnerabilities.
There are obvious authz uses for the password grant: you use it when you want to delegate access to a client running on your desktop, which is in your custody, and there's no point in running a multi-legged authorization protocol because you can just log the client in yourself. Your first thought about that might be "that's authentication", but it's not: you don't have to give all-or-nothing access (in theory) to such a client.
OAuth 2.0 password grant can be (mis)used for authentication the same way that LDAP Bind is used for authentication. That doesn't make either of them an authentication protocol.
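For reference, the password grant request shape (RFC 6749 §4.3) looks like the sketch below. The function name is made up for illustration; the point is that the client POSTs the user's credentials directly to the token endpoint, and the `scope` parameter is what keeps this authorization rather than all-or-nothing login:

```python
from urllib.parse import urlencode

def password_grant_body(username, password, client_id, scope=None):
    """Build the form-encoded body of a Resource Owner Password
    Credentials grant request (RFC 6749 section 4.3)."""
    params = {
        "grant_type": "password",  # the grant the thread is discussing
        "username": username,
        "password": password,
        "client_id": client_id,
    }
    if scope:
        # Scopes are why this is still authorization: the client can be
        # granted less than the user's full power, unlike a plain login.
        params["scope"] = scope
    return urlencode(params)
```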
I've been seeing more and more apps and websites have me log in with my xyz app login info. Then they ask if it's OK to grant access to xyz app so it can access xyz app information. Makes sense only if you are a developer: you are granting the frontend access to the backend via OAuth. Completely confusing for everyone else, and it was pretty confusing for me at first as well.
If you mean, like, logging into things with Google, sure, but isn't that technically OIDC? If you mean to say "most OAuth is used for OIDC, and is thus authentication", that's a different and less interesting claim. If instead you're saying that vanilla OAuth is primarily used for authentication, you're saying something more interesting (and problematic). You can use vanilla OAuth to log in, but you're adding a particularly subtle class of possible flaws in your design by doing so.
I mean the former. The primary use case of oauth on the modern web is to support openid connect. So much so that I expect it'll be a "SSL vs TLS" thing in the future where we actually use "oauth" to refer to the entire openid connect flow.
Sure, OK. But this article is really thinking about OAuth authentication in terms of OAuth itself, not OIDC. The dominant use of TCP on the Internet is (I hope?) to fetch URLs, but HTTP is not TCP. :)
What you're describing is a professional licensed engineer who has to, among other things, take out insurance because they are ultimately responsible for their decisions.
Software development is long overdue for a legally recognized professional software engineer license and certification process. I imagine it's going to happen down the road, after something important gets burnt.
Every time this comes up, people (some of whom are in this thread) end up talking about how this can't/shouldn't happen for software. After all, what, is every high-schooler or green college grad that ever wants to code their own app for a startup going to have to get a professional certification?
I guess I'd argue that those people shouldn't be legally allowed near this kind of thing without that kind of a certification. Looking into all of the other engineering disciplines, that's exactly the kind of thing you see. I have a BSME, but I haven't taken the Fundamentals of Engineering exam to get my FE cert, in part because getting a PE certification requires working underneath a licensed PE for a certain number of years, which isn't the case for my current job.
I also know that by not doing so, there are certain projects that I simply can't work on. I have to imagine that there's a way to create a legally enforceable framework that falls into the same category for software engineers. Want to build a company that creates a digitally-synced notepad? Have at. Want to touch personally-identifiable medical data? Better have a licensed engineer working on that project to sign off, else your company is wide-open to liability claims with teeth. If something unreasonable gets by the signed-off engineer, they're on the hook too.
Obviously, it's a complicated problem, and reducing things to a first-order solution rarely is a catch-all, but there has to be some more professional/personal responsibility taken by the individuals building these systems, and a requirement of licensure is a way of empowering engineers in those positions to the point where it actually matters.
> After all, what, is every high-schooler or green college grad that ever wants to code their own app for a startup going to have to get a professional certification?
You answer your own question fairly well, but I'd add the observation that in licensed engineering domains, we don't always require licensed engineers. We have a licensing regime for structural engineers, but we don't require them for minor structures like gazebos or doghouses.
We could have licensed Software Engineers, but only require licensed oversight for software dealing with human lives (avionics, medical devices), PII, elections, and a few other critical cases.
I developed software for medical devices and you have to do a risk analysis, formalize the software development process, declare qualifications of people, make it revision proof, have a formal testing process, ... everything is already accounted for.
Notified bodies ensure compliance. They have the problem that they cannot really evaluate the work of software engineers of course. Not even another software engineer could do that within feasible time limits. No software engineer can make sure there aren't exploits that could endanger user data. You can at most test if due diligence was ensured.
The manufacturer is responsible for ensuring safe operations of devices and yes, that includes keeping personal data safe.
But again, the problem wasn't the engineer at all, the problem is the wish for amassing data like this. Paper license or not, it rarely ensures competency and wouldn't have solved this problem.
Aside from legislative measures ensuring that user data belongs to the user the data is about, and that companies don't sell and share medical data with "friends and family", ... licensing is probably the last step, if it is even required at all, which I would dispute. There are no guarantees if you amass data like it was done here.
Make it prohibitively expensive to leak data (compliance fines, lawsuits) and the problem will solve itself. Companies that collect data will then be begging for certification and regulation.
It would be even better if people learned to refuse to give data irrelevant to the service that they are seeking and/or if there was some sort of regulation about this (I should not have to give my name and address when returning a product for example).
I've worked in critical infrastructure work and retain an interest in the field. A professional software engineering license should be legally required for certain classes of risk.
Ultimately the answer to an irresponsible feature ask should be:
> I will not approve implementing this feature, because in my professional training and experience, the risk exceeds the acceptable tolerances for a (spaceflight, medical, power systems) delivery. If I implement this and a failure occurs, I will be in court and my career will be over: I refuse.
We could run a 2 tier system. Anyone can build apps but to build apps handling sensitive PII you need to be registered. Execs need to be liable for this to work.
Guess I should go to jail for daring to make a script for me and my friends that uploads screenshots to my server without having a professional certification, huh? Same for releasing open source software.
While I agree it's overdue, I'm not sure what that event would look like.
When a bridge falls down, building collapses, patients die, people take notice.
Meanwhile, we've had the social insurance numbers and banking history of 165M+ UK, US, and Canadian people leak out of sheer technical negligence [0], and it resulted in a meek settlement and hardly broke through the public consciousness.
It seems to me that until someone dies in a way that is very clearly linked directly to a woefully negligent and under-trained software engineer messing up in a very public way, the needle is not going to budge at all.
Perhaps autonomous cars? Even then I doubt it, to be honest.
This reminded me of Therac-25 [0] wherein people died or were seriously harmed by software malfunctions in a radiation therapy machine, and standards were introduced [1] to help mitigate future incidents.
I would guess it's about 20 years out, honestly. The sequential and interconnected weight & cost of software failure will have to be well beyond the capacity of conventional errors & omissions insurance to tolerate; settlements will have to be business-crashing - Knight Capital level crashing - to drive this.
Today, it's well under that.
It is, of course, long overdue; Therac-25 should have really gotten the effort going, but c'est la vie with an irresponsible economy. It's plausible the EU legal systems will develop this effort first. I would not be surprised to see France or Germany fully develop the idea, probably in connection with Airbus or Siemens. Anyway. Idealism around quality....
If I remember history correctly, there wasn't one boiler explosion that caused us to start regulating who was allowed to design boilers. It's just that as boilers became more and more popular at the end of the 19th century, people started being maimed and killed more often, and at some point we just as a society decided enough was enough.
A huge number of security vulnerabilities are due to production mis-configuration rather than flaws in the development of the actual software. Software Engineers generally aren't trained in managing these production environments so you would probably want at least two different legal licenses to cover your bases.
The two are not orthogonal. I used to do some work in financial services and some of the systems were designed not to run risks in case of misconfiguration.
There's no chance of this happening, in my view. There are way too many vested interests in the "democratisation of development", especially the likes of "coding bootcamps", limitless frameworks and tools for "code-free" development, and extremely blurred lines on things like WordPress, where a "developer" could be anything from someone who can only install plugins through the front-end to crack PHP developers.
I think there's more to it than this. In traditional engineering, you only need to be licensed to sign off.
You can still effectively do exactly the same job as a licensed engineer without being licensed, you just can't sign off on work (And thus be held responsible).
The way I see it, creating a license for Software Engineers doesn't really fix the issue. It just creates a scapegoat to blame when things go wrong.
> It just creates a scapegoat to blame when things go wrong.
That's just how it works for other engineering disciplines; with the license, the scapegoat has the legal power to refuse responsibility until they're confident things won't go wrong.
Who is ever really going to be confident signing off on complicated software systems when, in effect, it's saying, "Yes, this system will absolutely not break/be breached/have something happen to it"? Given the complexity of modern software, it doesn't seem like a feasible solution.
Even top software companies have things break or go wrong very frequently compared to traditional engineering.
I don't see this as a good solution in the software world.
I don't think it would demand that the software be infallible. If a "new type of earthquake" happens, the Civil engineers won't be responsible for their bridges failing. In the same way, if there is reasonable effort made to ensure the system is secure by standards of the time, I don't think these theoretical individuals would be blamed for particularly novel (zero day) or unpreventable (social engineering, to at least some extent) attacks.
I learned that protected data is probably safe only for a finite timescale with all currently known forms of encryption, and that's assuming no technological advancement severely shortens that timescale.
Then you say "no". Business "needs" are important for replaceable serfs with no power to say otherwise; when the business needs engineering sign-off, you tell them what they need to do.
You are willing to pay the costs of "professionalizing" software development and installing expensive gatekeepers, certifications, signoffs, and processes at every step, right?
Most of that is already in play once about 30 people are in a group; we as an industry just haven't formalized it, and there are no legal teeth around it.
Equifax also comes to mind... amongst others, but you know.
I don't see anything changing as long as it's in money's interest to stay where it is (money rules this country, point blank). Right now it's in too many businesses' interest not to change. There are also slews of developers earning quite a bit who might be forced into career changes depending on how licensing could be implemented.
Yep. And how is this not a standard errors-and-omissions insurance policy that tech agencies like mine should be (or are required to be) carrying?
We spend about 15k per year for about 10mm in coverage (we are a small shop). This sort of oversight is not just an engineering one; it's fundamental to the core ops of the business. If we are rushing under client pressure (or just running late) to the extent that we take on risk that could trigger liability, it's a full stop.
https://cense.ai/about
Guy number two here had the background to know better. Just didn’t make it a priority. It’s unfortunate. The only way things change is when we start seeing data warehousing/collection as a liability (not an asset) and manage it accordingly. And making the penalty for error unforgivable.
Working as a contractor in the UK, most contracts stipulate you have Professional Indemnity insurance, which is for exactly this. I doubt anyone has actually made a claim on it, though; the number of other contractors I've worked with (also earning £500+/day) whom I wouldn't trust even to watch my laptop is atrocious.
AFAIK, this only really covers gross negligence or outright sabotage. I don't reckon anyone trying to pin bad business decisions on their devs is likely to succeed in court.
When I have done independent IT ops and software work I have carried professional liability insurance, and I would recommend that everyone does.
The irony is, of course, that PL/E&O insurance for software work runs pretty cheap, presumably because liability on the part of developers of software is quite rare!
In Australia we're all insured. The only developers who aren't are employees for the companies they're writing code for. The worst thing that can happen to them is they'd be fired. The company still owns the blame.
When deeply accustomed to privilege, attempts to restore merit often present as penalty.
Removing legacy preference in favor of merit is not penalization.
The article does not advocate admission based on racial preference.
It does point out that a level playing field allows more meritorious applications from races that would traditionally be displaced.
There's a lot going on, and it's really hard to keep abreast of it all.
https://www.reuters.com/legal/trump-says-law-firms-agree-pro...