4D LOG FILE TO JSON and object fields

Hi, it looks like the 4D LOG FILE TO JSON command doesn't export object (JSON) fields; I am getting "objectField":"BlobID: {number}".

Is there any way or tool that can be used programmatically to either parse the journal file and see object field content, or export the journal file to JSON/XML/TXT with object fields?

Appreciate your help,

Andrei


Andrei,

You could post this to the Feature Request forum as a feature request.

Regards,

Wayne

Andrei,
My guess is that the field in question is marked to store “in the data file”, rather than “In the record”.

When this is the case, 4D stores the field (whether an object field, blob field, text field, etc.) as a separate operation in the journal file, and only if the FIELD itself is updated. So if the record is updated but that field is not, the field does NOT get an entry in the journal file.

When converting the journal to a JSON file, it's not possible to know what was in such object fields, because the value is simply NOT added to the journal unless the field itself was updated.

The simplest way to adjust this is to change the field configuration to "store in record".
Regards, Tony

Cool, thanks, posted.

Hi Tony, is this theoretical, or have you tested it? I ask because I've tried both variants ("in data file" and "in record") and both just reference a blob. (Maybe I am doing something wrong? I've only tried it when inserting a new record, and I found this issue when looking back at JSON fields which I know were changed, and still they were not in the JSON export.)

The issue, from my understanding, is that object fields are treated as blobs. And it's not logical to export arbitrary blobs as text (where the field type is actually a blob). But for object fields, I would think it should be easy for 4D to export the blob as TXT/JSON.

Objects can not always be represented in text.

For instance, there can be a circular reference that would go on forever if “stringified”.

A 4D Write Pro document is a binary object; stringifying it only captures basic properties of the document.

Your application may only contain plain JSON, but an object field is capable of much more.

@Keisuke_Miyako, true, it's possible to have an object with circular references. My message talks about object fields, not just objects, so I think my comment still holds. If not, can you suggest how an object field can store an object with a circular reference?
In the 4D documentation: Object: attribute/value pairs in JSON notation. Here is a link to the JSON types: https://cswr.github.io/JsonSchema/spec/basic_types/

Just assign such an object to a field and save.

You can do the same with VARIABLE TO BLOB, or pass such an object to another process, even between client and server.

This was an enhancement in v17, as explained in Laurent Esnault’s Summit presentation from 2018.
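
For instance, a rough sketch of both (the table [Test] and its object field OBJField are only illustrative names):

C_OBJECT($o;$copy)
C_BLOB($blob)

// build an object that references itself
$o:=New object("child";New object)
$o.child.parent:=$o

// assign it to an object field and save
CREATE RECORD([Test])
[Test]OBJField:=$o
SAVE RECORD([Test])

// the same object also survives a BLOB round trip
VARIABLE TO BLOB($o;$blob)
BLOB TO VARIABLE($blob;$copy)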

Thanks @Keisuke_Miyako, I didn't actually know that was possible with 4D object fields; I should have checked before commenting. (I thought it would be similar to other relational databases.)

Would it be possible for LOG FILE TO JSON to export object fields, with one of these options:

  • Identify circular references (this might be hard in some cases); in that case, don't export the object field
  • Specify a maximum 'depth' for the object field
  • Specify a maximum number of 'characters' to export; if there are too many, don't export the object field

I cannot possibly answer such questions; that is way above my pay grade!

What is your expectation for LOG FILE TO JSON?

In your original post you repeated the word export several times; is your goal to export 4D records?

According to the documentation, one suggested use for the command is for the “administrator or any user to analyse database events”.

@Keisuke_Miyako, yeah, the idea was to export the changes to 4D records over time. We've actually made a script that exports all the .4BL files in a folder to a database (a Postgres database; it was easier to query there) where we can analyse when a record was changed, and to what value. LOG FILE TO JSON was the only built-in 4D command that allows 'export' of journal file content.
During an investigation of some data integrity issues a couple of weeks ago, I noticed that object fields were not present in the output of LOG FILE TO JSON, and that's the field we wanted to track changes on.

Anyways, I’ve also made a feature request for this.

And of course we could build our own mechanism to record record changes (with triggers), but we don't want to reinvent the wheel; since the journal file already contains the information, it would be good to use it.
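
For reference, the core of that script is roughly the call below (a sketch only: the export folder name is arbitrary, the looping over archived .4BL files is omitted, and the structure of the generated JSON files depends on the 4D version):

// export the current journal to JSON files in a folder on the desktop
$exportFolder:=Folder(fk desktop folder).folder("journal_json")
$exportFolder.create()
LOG FILE TO JSON($exportFolder.platformPath)

// read the generated files back for analysis
For each ($file;$exportFolder.files())
	If ($file.extension=".json")
		$operations:=JSON Parse($file.getText())
		// ...push $operations into our own database here
	End if
End for each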

Is there any way to look up the BlobID number and load it into a blob? The documentation example shows a path for a picture field. If you can find the blob, I suspect you could read it back into an object using BLOB TO VARIABLE and then use JSON Stringify to export it yourself.
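
Something along those lines, assuming the blob content could be obtained in a form compatible with VARIABLE TO BLOB (the lookup itself is the open question, and the variable names are only illustrative):

C_BLOB($fieldBlob)  // the blob data recovered for the BlobID in question
C_OBJECT($value)

BLOB TO VARIABLE($fieldBlob;$value)  // rebuild the object from the blob
If (OK=1)
	$json:=JSON Stringify($value;*)  // export it as JSON text
End if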

Andrei,
Sorry, I didn't get to respond to you yesterday.
I was thinking that LOG FILE TO JSON would export object fields, but as you've observed, it does not. (I tested it just now.)
I have a tool I've written that's (loosely) on the market called LogTools (not sold in stores), which allows you to open, browse, and filter 4D journal files AND export them to CSV, including object fields. If you really need something to do the job, LogTools may fit the bill.


@DeSoi.John, since we want to see the historic changes to the object field, I don't think there is any way to find them in the database (i.e. the value changes are stored in the journal file).

@Ringsmuth.Tony, I've tried LogTools 8.2.3 (it's the one by Business Brothers, right?). It doesn't export object fields to CSV (unless I am doing something wrong); all I can see is "Object: length -60" for the test that I did. Is that the expected behaviour, or am I doing something wrong? (Using the "in record" setting for the object field, by the way.)

I was referring to finding the object field data in the binary journal file. The changed object field has to be in there, otherwise there would be no way for the log file to bring the data file up to date.

I looked at it briefly with a hex editor and I can tell where an object field is being stored. Unfortunately, it is not the same format as you see with VARIABLE TO BLOB; writing something to extract it would be a lot of work.

Andrei,
I've verified that my latest version (8.3.1) of LogTools can indeed export object fields.
(Unfortunately, I do very little with LogTools nowadays, so the latest version is NOT on my downloads site at the moment.)
If you need it, please tell me and I'll get the website updated with the latest version.

@DeSoi.John
I see what you mean; I went down that path too (opening the journal in a hex editor), also saw the 'bits' of the object, and arrived at the same conclusion (it would take a fair amount of time to figure out how 4D stores the object). That's when I turned to the 4D forum, thinking someone must have had a similar need before and could share their wise solution.
@Ringsmuth.Tony
Thanks for the info. If there were any 'programmatic' way to run LogTools on a journal and export to CSV, we would be quite interested. (Manually opening journals with LogTools and exporting to CSV is not an option; when we have to go back through 3-4 months of history with a lot of .4BL files, any manual process would be too time-consuming.)

Just out of interest (and I do realise LogTools is not open source), how were you able to extract binary object fields from the journal file? You don't have to tell exactly how you did it, but maybe some hints or references? How did you figure out the exact format that 4D uses to store object fields in binary form?

I don't endorse this kind of reverse engineering; honestly, I feel like this is a misuse or abuse of the LOG FILE TO JSON command, but as a pure exercise…

All BLOBs (text, picture, object, BLOB) can be stored in 3 different ways:

  1. In record
  2. In data file
  3. In external file

The corresponding JSON log representations are:

  1. "Blob in record: "
  2. "BlobID: {bid}"
  3. "BlobPath: Table {tn}/Field {fn}/Data_{uuid}.blob"

For #2 and #3, there is also an "operationType:27" (create BLOB) or an "operationType:28" (update BLOB) entry that has "dataLen":{len},"blobNumber":{bid}, but you only have a "BlobPath: Table {tn}/Field {fn}/Data_{uuid}.blob" if the storage option designates an external file.

In other words:

For #2, you have the length but no data.

For #1, you have nothing, not even the data.

As for the content of the .blob file, you just have to create a minimal object like

$o:=New object("child";New object("parent";null))
$o.child.parent:=$o  // make the reference circular

store it, read it

$path:=System folder(Desktop)+"Data_{uuid}.blob"
$data:=File($path;fk platform path).getContent()

compare it with the content from VARIABLE TO BLOB, and you will notice that after the first 22 bytes, the contents are identical.
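
Putting that together, a rough sketch to rebuild and stringify the object from such a file (assuming the 22-byte offset holds in your version, and reusing $data from above):

C_OBJECT($value)
C_LONGINT($offset)

$offset:=22  // skip the 4D header; the rest is in VARIABLE TO BLOB format
BLOB TO VARIABLE($data;$value;$offset)

If (OK=1)
	// works for plain JSON content; JSON Stringify will fail on the
	// deliberately circular test object created above
	$json:=JSON Stringify($value;*)
End if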
