
Why’s Perforce not replaceable by Git?


Perforce has a locking mechanism for files, which is important for the large binary assets used in game development. Git doesn't really have that concept, so it requires more out-of-band communication to make it work for that use case.

Git LFS might handle this, but it's just not something that Perforce shops have to even think about, train their people on, etc. It's just "there" and there's somebody that they can call and yell at if things go wrong.
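For what it's worth, Git LFS does ship a locking feature modeled on this workflow. A rough sketch, assuming the repo already uses LFS and the remote supports the LFS locks API (the file paths here are examples):

```shell
# Mark .png files as lockable; LFS keeps them read-only until locked
git lfs track "*.png" --lockable
git add .gitattributes && git commit -m "make textures lockable"

# Take an exclusive lock before editing (fails if someone else holds it)
git lfs lock assets/hero.png

# See who currently holds which locks
git lfs locks

# Release the lock when done
git lfs unlock assets/hero.png
```

Whether this works at all depends on the hosting server implementing the locks API, which is exactly the kind of thing Perforce shops never have to think about.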


> Git doesn't really have that concept

Of locking files? Sure it does: if two people modify the same piece of code, there will be a merge conflict, and whoever's doing the merge gets the opportunity to reconcile the two versions. It's like a lock, but you don't have to think about it or explicitly lock and unlock the object you're editing. Combine this with rebasing (git-speak for "replay the changes from my branch on top of the branch I want to merge into") and you end up with a pretty coherent idea of what changed and in what order.
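For concreteness, that rebase flow looks something like this (branch names are examples):

```shell
# On my feature branch: replay my commits on top of the latest main,
# so history reads as "main's changes, then mine, in order"
git checkout my-feature
git rebase main

# If a commit conflicts, git pauses; fix the files, then:
#   git add <resolved-file>
#   git rebase --continue
```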

Having used both branch-oriented (like Git) and locking-oriented version control (not Perforce specifically, but my current dayjob involves a very similar scheme), I vastly prefer the former. There's less ceremony around doing things in parallel, and less risk of development deadlocks. For example: Alice locks A.cpp for editing, Bob locks B.dds for editing, and then Alice wants to edit B.dds while Bob wants to edit A.cpp. Now Alice has to revert her lock on A.cpp, wait for Bob to finish everything he needs to do, then relock A.cpp and redo her changes all over again (probably also debugging new issues from the bits and pieces Bob added) - or Bob has to do the same for B.dds - and cue fistfight in the breakroom.

----

That all being said, binary files are indeed a weak spot for Git, so if you really do want to version control your textures and models and sounds and such, then sure, perhaps a different VCS tailored for that use-case would be more ideal. In my experience, though, that tends to be an anti-pattern; generally better to keep code and assets separate (I'm well aware that tools/engines like e.g. Unity don't play the slightest bit nicely with that separation), using separate version control systems tailored to the specific content.


Merge conflicts are definitely not the same thing as file locking. At the point where you're getting a merge conflict, it's too late. How are you going to resolve conflicts in an audio file?

File locking is a way to communicate in advance that nobody should attempt to make changes to a particular file until the lock is cleared. Merge conflicts are communication in arrears (so to speak) and at that point, effort has already been wasted by at least one person.


> How are you going to resolve conflicts in an audio file?

By choosing one and discarding the other, or by creating a new audio file that incorporates both changes, just like one would do for any other file.

While I don't know if this exists, there's nothing theoretically preventing the creation of a diffing/patching tool for audio/video files, detecting insertions/deletions/replacements (perhaps by timestamp/frame rather than by line) the same way `diff` does for text.
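As a toy illustration of the idea (emphatically not a real tool): dump each file as fixed-size hex "frames" with no offset column, and plain `diff` will then report insertions, deletions, and replacements at frame granularity, since an inserted frame no longer shifts every later line. File names here are examples:

```shell
# 16-byte "frames", plain hex, no offsets
xxd -p -c 16 old.wav > old.frames
xxd -p -c 16 new.wav > new.frames

# diff now reports which frames were added/removed/changed
diff old.frames new.frames
```

A real audio diff would obviously want to chunk on codec frames or timestamps rather than raw byte counts, but the mechanism is the same as line-based `diff`.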

> at that point, effort has already been wasted by at least one person.

A file lock doesn't prevent that time waste; there's nothing stopping someone from overwriting the file the moment the first editor commits and releases the lock (in which case the first edit was a complete waste of time).


I don't quite think you understand how perforce works.

Locking the file is a signal not to work on it because someone else is modifying it. There isn't currently software that supports semantic merging of audio or video, so perforce says "someone is recoloring the asset, don't change the animation until they're done, or you'll just have to do the work again".

In the meantime, perforce prevents you from checking the file out. So you have to explicitly and clearly bypass your source control to do what you suggest, at which point someone rightfully yells at you.
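For reference, the flow being described looks roughly like this (depot paths are examples; the `+l` filetype modifier is what makes a file exclusive-open in Perforce):

```shell
# Open the asset as exclusive: only one person can have it open at a time
p4 edit -t binary+l //depot/assets/run.anim

# ...make changes; anyone else's "p4 edit" on this file now fails...

# Submitting the changelist releases the exclusive open
p4 submit -d "Tweak run cycle timing"
```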


> Locking the file is a signal not to work on it because someone else is modifying it

I know how file locking works. Like I've said before, my day job involves using a locking-based version control system. My disdain for locking-based version control comes specifically from having to put up with it instead of using something sane like Git.

> So you have to explicitly and clearly bypass your source control to do what you suggest, at which point someone rightfully yells at you.

And yet that will inevitably happen anyway, because someone's breathing down your neck to get that animation done and won't take "well I'm waiting for Bob to finish recoloring first" as an answer. Cue the aforementioned fistfight over whether Alice's animation or Bob's recoloring can happen first.

Or it'll inevitably happen due to the aforementioned risk of deadlock (Alice is tweaking animations in a different order than Bob is tweaking textures, and thus run into a situation where both are waiting on the other to release locks on files; cue fistfight).

I've seen both cases happen repeatedly. All of those cases would've been avoided had we used a version control system that used branches and merges instead of locks and unlocks.


> because someone's breathing down your neck to get that animation done and won't take "well I'm waiting for Bob to finish recoloring first" as an answer

As it happens, that's exactly the answer you're supposed to give. It's not like game devs have any shortage of things to work on at any particular time.

> All of those cases would've been avoided had we used a version control system that used branches and merges instead of locks and unlocks.

I don't think you're thinking this through all the way. If I have 5 different branches where textureA.png got modified, which one is the right one to end up with after everything is merged? Who is going to have to re-do work? Branches do not solve that problem, and without out of band communication, that problem is made significantly worse than if those changes were done in sequence (which is what file locking enforces).


> I've seen both cases happen repeatedly. All of those cases would've been avoided had we used a version control system that used branches and merges instead of locks and unlocks.

Well no, the failure modes would be different, but all of those things would still cause problems with branching and merging, mostly because you can't actually merge binary artifacts, only rebase.

And in a world where you can only rebase, locking actually makes sense.


> all of those things would still cause problems with branching and merging

The deadlock most certainly would not. One would get rebased on top of the other, in the worst case.

(In the best case, Alice and Bob would both be committing/pushing to their respective branches early and often, and could therefore look at each other's branches and see what changed between them. If you really want to emulate locking, you could do so pretty straightforwardly by checking whether any commits outside your own branch's history touch the file in question, and it should be doable to write automated tooling to that effect. All the "benefits" of locking without getting in the way of anything.)
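A minimal sketch of that tooling, assuming branches are pushed somewhere everyone can see (the asset path is an example):

```shell
# List commits that touch the file and are reachable from some branch
# but not from my current one - i.e. "someone else is changing this"
git fetch --all --quiet
git log --oneline --branches --not HEAD -- assets/textureA.png
# non-empty output = treat the file as "taken" and go talk to them
```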

> And in a world where you can only rebase, locking actually makes sense.

Even in a world where you can only rebase, locking is far more of a hindrance than a benefit, for the reasons already outlined.


A lock prevents you from opening the file for editing in the first place - instead you get a warning along the lines of "so-and-so is currently editing this, are you sure you want to open it read-only?". This reduces accidental time waste (and computer systems shouldn't try to protect people from deliberately wasting their own time, because they probably aren't).


Sometimes it is desirable to not have optimistic locking like git. Often you want to signal that you don't want a file/asset modified for a period.


I think we're gonna have to agree to disagree on that. I've found that locking a file from any changes entirely almost always causes more problems than it ostensibly solves, and is a massive impediment to productivity on any team larger than a single person. The vast majority of the time it's entirely unnecessary anyway.


Absolutely agree with you when it comes to textual/structural information where merge conflicts can be solved easily. But for binary like formats, I feel often there is no way to resolve a conflict.


Yes, but you probably don't make a habit of storing music or video (or weird proprietary binary animation formats) that people are modifying in git.


Indeed I do not. If game developers (and/or game engine developers; looking at you, Unity3D...) showed similar restraint, then they likely wouldn't feel the need to tie themselves to 90's-esque version control schemes.


The game industry has a completely different way of working compared to other tech fields. Most of their practices seem old-fashioned, or like anti-patterns, if you look at them from an outside point of view.

But they're trying to do something very hard: control both quality and deadline. It's not like an open source or enterprise IT project. The game has to be finished, good, by a given date. To do that they do many things, including slaughtering sacred cows about how things are supposed to be done. If hard locking saves them time by avoiding wasted efforts, by artists who are probably over-worked already, then what's the problem?


> But they're trying to do something very hard: control both quality and deadline.

Do you believe other tech fields are not trying to do the same exact thing? Do you believe that enterprise IT projects are somehow less beholden to both quality and deadlines than video games?

Controlling both quality and deadlines is what literally every (professional) software development project should be doing. The implication that game developers are somehow better about or more cognizant of this (let alone specifically because of broken development workflows) is pretty rich considering how frequently big-budget video games end up in development hell (see also: Duke Nukem Forever) and/or end up riddled with bugs (see also: literally everything Bethesda's released in the last decade or so).

(Not saying that enterprise IT is necessarily better in either regard, of course; bugs and delays are inevitable as software complexity grows, be it an RPG or an ERP...)

> If hard locking saves them time

It doesn't save time in aggregate, though. It maybe saves time for one specific corner case (conflicting edits on binary files) at the expense of slowing down everything else.

And if you're at the point where "conflicting edits on binary files" is no longer a corner case, then that betrays at least one of two dysfunctions:

- You're using some sort of container format instead of sanely keeping things split into separate texture/model/animation/normal/behavior/etc. files (and then - if the amalgamated container file is indeed necessary - combining them as part of the build process).

- You've got a communications breakdown somewhere: if you're really running into a lot of cases where multiple people are trying to edit the exact same normal map at the exact same time, file locking can band-aid over that, but it still doesn't address why two people set out to independently edit the same thing in the first place.


You're not really going to get away from large binary files in game development. They need to be version controlled somehow. Adding more layers of indirection for how that actually happens is just kicking the merge-vs-file-locking can down the road.


Additionally, you have artists and level designers working on these files. These people tend to be semi-technical, and confronting them with the full details of git is usually met with a lot of resistance.

I know some smaller studios that use git to some extent, and they tend to struggle with it on occasion - again, primarily in the art & design departments, mainly with binary files. All the larger studios and some of the smaller ones seem to use perforce. Publisher-owned multinationals might have their own proprietary systems, I don’t have any insider information there.

In any case, people who aren’t familiar with the requirements of gamedev like to argue about this. It’s like proponents of forks arguing with someone trying to eat soup with a spoon.


> You're not really going to get away from large binary files in game development. They need to be version controlled somehow.

Yes, but if they really can't be split up into smaller components, then there's always the option of using a different repository and VCS specifically for binary assets. Yes, it's kicking the can down the road, but at least it's kicking that can away from the actual code.


From what I've seen, perforce is better for storing binary intermediates.

Git doesn't work very well with partial or subfolder checkouts, which is something you'd need if you want to store intermediates. In games, intermediates can take multiple hours to build (maps, for example). Building the game from scratch on every commit, as is usually done with git CI tooling, does not work very well.
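Git's closest equivalent is sparse checkout combined with partial clone, which can fetch and check out just the folders you care about - though it's still clunkier than Perforce client views. A sketch, with an example URL, branch, and folder names:

```shell
# Clone without file contents, then check out only the asset folders
git clone --filter=blob:none --no-checkout https://example.com/game.git
cd game
git sparse-checkout set maps materials
git checkout main
```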

We also ran into wanting to give modders access to maps and materials but not to game code.



