Build Application without merging 4D Server

Most of the time, I use the Build Application command just to create a new version of my structure files without changing the server's 4D version.
But the command forces me to create a new bundle including Server, Plug-Ins, and Components, which takes a lot of unnecessary time.
How about just creating the .4DC, .4DIndy, and Resources files without wasting time creating the rest?
In fact, the only difference between a non-server application's .4DC file and a server application's .4DC file must be a single encrypted bit somewhere within the file. It would be easier to simply create the new version of the compiled files independently of what we will do with them: either use them with a standard 4D Server or merge them into a Server Application bundle.

On Mac, merging also signs the package, which is a huge difference.

Merging also checks that the version used matches the version selected for the build, making sure there is no version mismatch (a common error).
It is not really time-consuming: what does it take, one minute?
But it provides a lot of additional security AND allows automatic updates of Client and Server. A minute well spent.

Think about automating your process. After running Build Application, if OK=1, export the structure as Project/Text and commit it to Git (or similar) to get a version history and a way to track changes between builds. In that whole process, the one minute of build time is only a small part…
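A minimal sketch of such an automation method in the 4D language (the method name, repository paths, and commit message are hypothetical; it assumes EXPORT STRUCTURE is available in your 4D version and that git is installed on the machine):

```4d
  // Hypothetical method: BUILD_AND_COMMIT
  // Builds the application, then exports the structure as XML and commits it to Git.
C_TEXT($xml;$path)

BUILD APPLICATION  // uses the current build settings
If (OK=1)
	EXPORT STRUCTURE($xml)  // structure definition as XML text
	$path:="/path/to/repo/structure.xml"  // hypothetical repository path
	TEXT TO DOCUMENT($path;$xml)
	  // commit via the command line; paths and message are examples
	LAUNCH EXTERNAL PROCESS("git -C /path/to/repo add structure.xml")
	LAUNCH EXTERNAL PROCESS("git -C /path/to/repo commit -m \"build "+String(Current date)+"\"")
Else
	ALERT("Build failed")
End if
```

Run it as a scheduled or on-demand method; each successful build then leaves a diffable trace in the repository.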

It takes much more than a minute in my case.
Besides, my updating process when not changing the 4D version is just copying .4DC, .4DIndy, Resources, and the Web Folder (not included in Build Application, so I must handle it separately). If I don't do things properly, I get a version mismatch error when trying to launch the 4D Server application.
I don't need to deal with automatic updates of Client and Server when I'm just updating my structure.
When I want to update the server's 4D version, I just replace the complete folder I created the first time I built an application with that version, except for the Server Database folder. I can't replace that folder, because local preferences are stored within it and I would destroy them if I replaced it with the new version. Then I replace the structure files separately, because I must put the latest version there, not the one created when I first built the application.
And the same thing happens when I must upgrade a Mac 4D Server, with the extra work of showing the package contents to replace what I need that is stored within.

Yes, as I tried to explain, you run into a lot of trouble if you try to do it differently.
Expanding the package, avoiding replacing files that are not supposed to be replaced, rescuing files, trying to avoid mismatches, and so on. An endless story.
And it is so hard to do. You compile today at 16:00 while 50 users are still connected. You need to stay in the office until you are the last one out, or try to be the first to arrive tomorrow morning.

But if you use the Build Application process as it is designed, you just (automatically?) need to copy the result onto your server. If you work in-house, just use COPY DOCUMENT. If you are a software editor, zip it and upload it to FTP (and do the opposite on the customer's computer), all automatically.
The server checks once per night whether an update is available; if so, it starts the automatic update process (another feature, but one that requires that Build Application is used).

As a developer, you just launch a method which compiles, builds, zips, and uploads to the server. It runs in the background; you can do something else, and everything is automatic.
The next morning, the server has updated and restarted itself. If 4D's version number has changed, clients will receive their update, all automatically.
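Such a launch method might be sketched like this in 4D (the zip and upload commands, paths, and FTP address are placeholders, not the actual setup described above; it assumes zip and curl are available on the build machine):

```4d
  // Hypothetical method: DEPLOY_BUILD
BUILD APPLICATION  // compiles and builds according to the current build settings
If (OK=1)
	  // zip the built server application; paths are examples
	LAUNCH EXTERNAL PROCESS("zip -r /builds/MyApp.zip /builds/Server")
	  // in-house: copy onto the server share with COPY DOCUMENT;
	  // for customers: upload to FTP (curl is one option)
	LAUNCH EXTERNAL PROCESS("curl -T /builds/MyApp.zip ftp://ftp.example.com/updates/ --user name:password")
End if
```

Started in its own process, it leaves the developer free to keep working while the build and upload run.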


Your advice is very good when applied to a single 4D Server installation with just production, (maybe) testing, and development databases. That's the way 4D has designed this Server Application/Client Application bundle.

But when you have about 100 4D servers running the same application in different locations, not all of them with a good Internet connection, it isn't practical to push a complete 4D Server update (about a 287 MB .rar-compressed file) to all of them every time I need to change a version just for adding a field to a table or fixing a bug.

Instead, I take advantage of the RESTART 4D command to automatically update all the servers with a single command executed from a central application. When it's time to be updated, each 4D Server downloads the new version by FTP (just a 40 MB compressed file, for a structure that, as a .4DB, is more than 300 MB), uncompresses it into the target Server Database folder located in an exact copy of the Server folder, and then executes RESTART 4D, and the update is done. It would be even better if 4D allowed a RESTART 4D with an update folder that contains just the .4DC, .4DIndy, Resources, and Web folders. But I know the answer: it wouldn't be as safe, because the new structure files might not be compatible with the 4D Server version, etc., etc.
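The per-server update routine described above could be sketched like this in 4D (download and decompression go through external tools here; all paths, the URL, and the archive format are placeholders for whatever the real setup uses):

```4d
  // Hypothetical method: SELF_UPDATE, run on each 4D Server when told to update
C_TEXT($archive;$target)

$archive:="/tmp/update.zip"  // example path; the real setup uses a compressed archive
$target:="/path/to/ServerCopy/Server Database"  // exact copy of the Server folder

  // download the new structure files (curl stands in for the FTP client actually used)
LAUNCH EXTERNAL PROCESS("curl -o "+$archive+" ftp://ftp.example.com/updates/structure.zip")
  // uncompress into the target Server Database folder
LAUNCH EXTERNAL PROCESS("unzip -o "+$archive+" -d \""+$target+"\"")

SET UPDATE FOLDER($target)  // point 4D Server at the prepared update folder
RESTART 4D(10)  // restart after 10 seconds; the update is applied on relaunch
```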

I hope 4D will someday develop its excellent development tools with one-app/many-servers installations in mind, the way many of us software deployment developers are used to working.




First of all, I think I understand your request.

  1. When you say “not all of them with a good Internet connection”… do you mean that if 100 “servers” try to get 287 MB from your FTP server at the same time, it is not efficient? Or that, individually, some servers take too long to download 287 MB?
    I helped an OEM use AWS S3 to distribute its updates. That way they have no problem distributing large files simultaneously (AWS S3 can handle it).

  2. Maybe git or rsync could be used to optimize the distribution of “diffed” built application versions? I haven't tried it for applications. I know that is not git's or rsync's original purpose, but it might work. It could also be a headache to sort out when there is a problem, as Thomas explained (with signed binaries). So the simplest solution is to do a full build.

PS: I also have DSL Internet access (not fiber); upload is 100 KB/s, download is 1 MB/s (45 minutes for 300 MB). Sending large files feels like watching paint dry. I can't wait for fiber.

Thanks, Bruno, for understanding.
I mean that, individually, some servers take too long to download 287 MB.
My updating method leaves a gap of 10 minutes between servers running on the same machine, because sometimes two of them try to execute SET UPDATE FOLDER and RESTART 4D almost at the same time, and that causes a runtime error.
For the moment, not all the servers are updated automatically, because some of them run as services, and at least in version 16Rx, RESTART 4D doesn't work reliably on services.
However, I do mass updates of about 50-60 servers running on more than 10 different computers in different locations, and our FTP site never collapses. I usually do that on weekends, starting early in the morning. But it's a 40 MB transfer for each one; in a couple of hours they are all done. I don't know what would happen with a 287 MB transfer.
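The 10-minute stagger between servers on the same machine could be sketched like this in the central application (TRIGGER_UPDATE and the server list are hypothetical stand-ins for whatever actually signals each server to run SET UPDATE FOLDER and RESTART 4D):

```4d
  // Hypothetical method: MASS_UPDATE, run in the central application
C_LONGINT($i)
ARRAY TEXT($servers;0)
  // hypothetical list of servers sharing a machine; the real list is larger
APPEND TO ARRAY($servers;"server-01")
APPEND TO ARRAY($servers;"server-02")

For ($i;1;Size of array($servers))
	  // TRIGGER_UPDATE is a placeholder for whatever tells the server
	  // to download the update and restart
	TRIGGER_UPDATE($servers{$i})
	If ($i<Size of array($servers))
		DELAY PROCESS(Current process;36000)  // 36000 ticks = 10 minutes
	End if
End for
```

Spacing the triggers this way avoids two servers on one machine executing SET UPDATE FOLDER and RESTART 4D at the same moment.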