
Data in the OrderItems table changes on its own

Hello,

I just noticed a weird thing happening to my database. Today I was working on the CRM database in the office where I work, in the cloud version (since they have Windows OS).

After reaching home (as I am writing this), I took a backup of the cloud database, downloaded the backup and imported the archive into my macOS Ninox app.

But to my surprise, the Order Items in the last Sale Order had changed on their own. At first I thought it was some glitch, so I deleted the database from my Mac, made a fresh backup on the cloud version and then imported it again on my Mac. This time again, in the same record, two items in the Order Items table were changed (the same ones as before). I am attaching screenshots of both databases (cloud version and Mac version).

I have checked other Sale Orders too, and this has happened in other records as well.

How did this happen? Is it 'cuz on the cloud I am using 3.13.1 and on the Mac app it is still 3.12.8?

What I am noticing is that only the name of the Order Item is different; the rest of the data like quantity, unit price etc. is all correct. Even the product name is correct. How is it picking up the wrong items while all the other details are correct?

This is a serious data glitch on real sales data. Does anyone have any idea how to prevent this from happening in the future?

13 replies

    • Fred
    • 2 wk ago

    What kind of field is Item Name in Order Items?

      • Database App Developer
      • vermau81
      • 13 days ago

       'Item Name' is a Dynamic Choice field which shows the Product Items based on the Product that is chosen in the Product field (which is a reference field).

      Ninox support gave me this explanation of the data glitch:

      Hello Verma,

      Thank you for your message.

      Could it be that you are determining the items via a dynamic choice field? Please keep in mind that in the background, a dynamic choice field just contains a number (or an array of numbers for multiple choice) that represent the ID(s) of the chosen record(s).

      When a database is backed up manually and downloaded, then reimported somewhere else, gaps in IDs are closed as the structure of the database is defragmented. That means if the existing records in a table had the IDs 1, 2, 4, 5, 8 and 9 (because 3, 6 and 7 had been deleted at some point), the records in the newly imported version will instead have the IDs 1, 2, 3, 4, 5 and 6 again.

      That means that a dynamic choice field which had record 4 selected before just contained the number 4. Because the IDs in the newly imported database have now shifted around, record 4 is now what record 5 was before the migration. That is also likely why all the other data fields were untouched, as their contents are not dynamically determined.

      If this applies to your situation, you will have to manually switch the affected values around to solve it. To prevent this from happening, we suggest that you do not employ dynamic choice fields in your order items and instead let static data fields be filled via triggers, as those will then be unaffected by migration and defragmentation.

      Should I have misdiagnosed the issue, or should you have any further questions, please do not hesitate to contact us.

      Kind regards

      Seriously, I don't understand any of this. The ID of a record is supposed to be an imprint/snapshot for that record. It is the very BASIC IDENTITY of that record. It's like an SSN of a person, and you are telling me that the SSN changes whenever the person changes his address to a different city and all the benefits that come with his SSN get compromised.

      How can a database be so NAIVE? I have worked with dozens of databases in the past and never ever faced any situation like this. Big corporations take database backups almost daily and they use those backups quite often when required.

      This is a serious DATA GLITCH. My boss's business depends on this financial data. He has his own cloud account and I have mine. All I am doing is taking a backup of the database and using that backup to work at home. What kind of backup is this that changes data based on the type of STORAGE?

       

      Let me CONFIRM here that when I upload the backup at the same time to my iCloud and Ninox Cloud, the database in my iCloud shows the data glitch, while the database in Ninox Cloud stays perfect. So now you mean to say that the GLITCH has its own taste and chooses when to react abnormally.

      I don't understand how a Dynamic Choice Field can contain the ID of a record. It only contains some list of values for every record. It is just a column in a record.

      What am I doing wrong here? I am just filtering the Product Items based on the type of Product chosen to sell. Product is a reference field, and as soon as we choose a product, the current record comes into CONTEXT, so we just choose the Product Items based on the current record in context. I mean, isn't this the whole IDEA of Ninox, making the whole table (columns) available as a reference to choose from?

      If this is how Ninox works, then I guess we have to go the old-school way of having a PRIMARY KEY in every table and filtering the records based on the KEY.

      Can someone please give me a logical insight into what is happening and why it is happening, 'cuz it happens only in iCloud and not in Ninox Cloud?

      Also, this started happening with 3.13.1. This never happened when I was using 3.12.8 or even 3.13, 'cuz I have been doing this (download and upload) for the last month and never faced any data glitch.

       

       

      • Alain_Fontaine
      • 13 days ago

       I have done this - I mean making a backup on one machine and loading it on another - and never observed the so-called "defragmentation", where the records are renumbered with consecutive numbers. Now, if Ninox has decided to perform such a "defragmentation", there is nothing inherently evil about that, but then the procedure must adjust all the references, of all kinds, everywhere in the database. Just imagine a disk defragmentation utility that would move the sectors containing the data, but without updating the pointers organizing the files…

      • Database App Developer
      • vermau81
      • 13 days ago

       That is exactly my point. This is not a small bug that people can live with for some time. This is a data breach. Thank God there were only a few entries in the Sale Order, as the sales guy is new and doesn't have many sales, so my eyes caught the discrepancy in the data. If there had been a couple of hundred entries, no one would have noticed what was happening. This is Financial data.

      Ninox has to fix this asap.

      For now I am working on Ninox Cloud on both accounts (my personal one and the office account). I purposely downloaded the database from the office Ninox Cloud account and uploaded it to my personal Ninox Cloud account to see if the anomaly happened this way as well, and there was no anomaly.

      So I can say that the problem occurs when the database is uploaded from Ninox Cloud to iCloud, and specifically in Dynamic Choice Fields or any other elements which hold or are based on references to records in the table.

      • Fred
      • 13 days ago

      Sadly, I reported this over a year ago. Luckily for me, I was only using the dynamic choice field as a UI element and not as real data.

      This post can help you with the code to put in your trigger after update of the dynamic field.

      vermau81 said:
      I don't understand how a Dynamic Choice Field can contain the ID of a record. It only contains some list of values for every record. It is just a column in a record.

      All dynamic choice fields are made up of records from a table, that is why you have to do a select or use a reference field to build your choices. You can have it show any field from the table you want. It can be one field or a concatenation of numerous fields. In the end Ninox is only tracking the record Id, not what is displayed to the user.

      There should be some warning in their docs about this issue if they feel like it is something that can't be "fixed".
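
      If you end up going the route support suggests, the rough shape of the "trigger after update" on the dynamic choice field would be something like this (a sketch only; 'Item Name Text' is a hypothetical plain text field you would add to Order Items):

          if 'Item Name' then
              'Item Name Text' := text('Item Name')
          end

      The static text field keeps whatever label was displayed at the moment of selection, so it is not affected if the underlying record IDs get renumbered after a backup is reimported.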

      • Database App Developer
      • vermau81
      • 12 days ago

       Two things, Fred:

      1. The data glitch only happens when the archive is imported and uploaded to either the local hard drive or a personal iCloud account. When the archive is uploaded to Ninox Cloud, the data remains intact. I can give proof of this. So clearly this is a bug in the software somewhere which the company is too lazy and lousy to find and fix (especially if you reported this a year ago), or this might be the company's way of getting people to buy the Ninox Cloud subscription.

      I guess Ninox is choosy about where the archive is imported and decides to scramble the data based on the upload location. So in other words, this software is just a fancy-looking TOY that can be used to do some cosmetic things around the data, but cannot be relied upon for serious development. This software is a JOKE.

      There is no point in showing off fancy concepts like making the whole table available so that one does not have to rely upon a PRIMARY KEY, when the core code of the software is broken. DEFRAGMENTATION or not, the DATA should remain INTACT.

      2. There are close to 1000 product items that have been categorised under different product lines, so each Product has a few hundred product items under it. When a sale happens, the person first chooses a product, and based on that the Dynamic Choice field shows the respective Product Items.

      I believe this was the basic idea behind building this element (dynamic data on the go, as per the requirement).

      If in the end we have to use SELECT statements to derive the data all the time, then I guess only the old TIME-TESTED SQL and other database software are trustworthy. Ninox can DITCH you anytime. My data was small, so I caught the anomaly. People with massive data are not aware of this data glitch as we speak.

      I really liked Ninox's concept of making the whole table available for reference, compared to other software like FoxPro, Oracle, MySQL, FileMaker etc., but if this comes at the cost of data integrity and security, then it is NOT WORTH IT.

      No doubt Ninox seriously reduces development time, but this is crazy. FileMaker has a steep learning curve, but it is SOLID and TIME-TESTED.

      I have been asking my boss to pay for the subscription as the trial period is almost over and I am getting deep into the CRM I am building for him. But now I think that the onus will be on me if things go south in the future.

      I don't want to leave Ninox. I don't understand what I should do at this stage!! Development on both my projects is at a HALT.

      • Alain_Fontaine
      • 12 days ago

       Arguing with oneself can be seen as a sign of a serious mental disorder, but here it goes anyway… Thinking further, I can't see how such a "defragmentation" could be made to work flawlessly. For example, one can store the numeric identification of a record in a plain "Number" field, to be reused later. How could a "defragmentation" procedure reliably detect such cases to properly correct the numeric values? So my final conclusion is: NEVER touch the IDs of the records under any circumstances. The gaps in numbering are just a cosmetic non-issue. There is still room for a "cleaning" action, to get rid of the garbage that Ninox tends to accumulate over the life of a database.
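
       To make that concrete: nothing stops a script from stashing a record's numeric ID in an ordinary Number field and resolving it again later. A sketch in Ninox script, with made-up names (SavedProductId as a Number field, Products as the referenced table):

           SavedProductId := Product.Id

       and later, in some other script:

           let p := record(Products, SavedProductId);
           p.Name

       A renumbering pass has no way of knowing that the value sitting in SavedProductId was ever meant to be a record ID, so it cannot possibly correct it.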

    • szormpas
    • 13 days ago

    "When a database is backup up manually and downloaded, then reimported somewhere else, gaps in IDs are closed as the structure of the database is defragmented."

    I'm a bit worried about this wording! If it's true, it applies to all users who keep manual backups.

    We can't rely on the IDs of the records because they might change when we need to import from the backup. For example, I use a table for the settings in my database and use the IDs to reference each setting.

    Also, does the above apply if we click "Defragment database" in the Options menu?

      • Fred
      • 13 days ago

      You would only see this in root tables that have deletions, as gaps have been created and defragging closes those gaps.

      You should ask Ninox about the manual defrag operations.

    • Fred
    • 12 days ago
     vermau81 said:
    I don't want to leave Ninox. I don't understand what I should do at this stage!!

    One option is to create another table that stores the selection made in the dynamic choice.
     

    I posted a link to a post that deals with this in an earlier reply. 
     

    Thus you don't depend on the dynamic field for data, just for UI.
     

    By copying the selection to a table, you also have better access to the data. Trying to get data off a dynamic field just complicates matters. 
     

    This method doesn't change the front end. Users still see the choice field, but you know on the back end to look at the table that stores the selection.
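
    Roughly, that could be a trigger after update on the dynamic choice field which copies the choice into the extra table. A sketch with placeholder names ('Selected Items' as the extra table, 'Order Item' as a reference back to Order Items, 'Item Text' as a text field):

        let sel := create 'Selected Items';
        sel.('Order Item' := this);
        sel.('Item Text' := text('Item Name'))

    In a real setup you would also clear or update the previously created row when the choice changes; this is only meant to show the shape of the idea.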

      • Database App Developer
      • vermau81
      • 12 days ago

       Or we can just use SELECT queries combined with a WHERE clause to fill the dynamic field. This way, no matter how many times the data gets scrambled 'cuz of defragmentation, it will always be fetched fresh using SELECT whenever the form is opened. I hope I am right. This way we don't have to create another table just to store the selected data. The data will always be selected on the go.

      But no matter what solution we implement, at the core it kills the whole purpose of Dynamic Choice Fields. They should remove these elements or fix their working logic.

      Please advise me if I am wrong about using SELECT statements. I just want to make sure that I am on the right path moving forward.
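
      Just to spell out what I mean, the kind of formula I am thinking of for the choice field's dynamic values would be roughly (table and field names as in my database, syntax from memory, so please correct me if it is off):

          let p := Product;
          select 'Product Items' where Product = p

      with the visible text of each option coming from the Product Item's name field.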

    • Fred
    • 12 days ago
     vermau81 said:
    Please advise me if I am wrong about using SELECT statements.

     I am not knowledgeable at all about how Ninox's backend works, so I can't say if using select statements will make any difference. I suspect not, if my wee brain can grasp what is going on.

    Just a thought, but Ninox may not care about the app having problems with a downloaded DB from the cloud, since the app is free. You say you have no issue in the cloud, so that is where, I suspect, they care most. If the app had issues with iCloud or local DBs then that would be a bigger issue.

    I was shocked and angry too in the beginning. It has reared its ugly head again, but that could be because I downloaded the DB and re-uploaded it to reduce the DB size. But since I don't really depend on it, I don't really care.

    Good luck in finding a solution that works for you.

      • Database App Developer
      • vermau81
      • 12 days ago

       There is definitely something wrong with the code in the latest version 3.13.1, in that it treats archives differently when uploaded to iCloud or a local machine versus Ninox Cloud.

      I will try with SELECT at least once and see if it works, 'cuz even if we store the data in that accessory table, we will still have to use SELECT to fetch the data out of that table and auto-fill the other elements on the form or use it elsewhere.

      Thanks for your time and effort, Fred.
