
Trouble deleting all records in very large table

I often have tables with many records, possibly hundreds of thousands, and I occasionally might want to clear out the tables so I can reload them from scratch (e.g. re-import from refreshed CSV files).

In the macOS native Ninox app, if I select all records and then try to delete them, it either sits and spins or returns immediately without actually deleting any records.

If I select, say, 10,000 records at a time, I can delete them once or twice, but eventually it hangs with a spinner and I have to force-quit.

If I write a button script in another table that does something like "delete select tblnam", it generally just hangs with a spinner when I click the button.

This has been a problem for me forever, and I keep working around it in various clumsy, time-consuming ways, but surely by now there must be some way to clear all the records from a very large table? Am I missing something?

5 replies

    • Tim
    • 10 days ago

    I received a suggestion from support@ninox.com to delete in even smaller batches of 1,000 records at a time, on the assumption that I am hitting some kind of resource exhaustion (memory?) on my local machine.

    I then wrote a button script that deletes 1,000 records at a time until none are left, and it runs to completion in a reasonable amount of time. With this approach, I have deleted all the records in tables containing over 200,000 records.
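    For reference, the script is essentially the following. This is just a sketch: the while loop and Ninox's slice() and count() functions are the moving parts, "tblnam" stands in for the actual table name, and the batch size of 1000 is the value support suggested; adjust as needed.

    ```
    // Delete records in batches of 1000 until the table is empty.
    // slice(array, start, end) takes the first 1000 records of the selection,
    // so each loop iteration only deletes a small batch at a time.
    let batch := 1000;
    while count(select tblnam) > 0 do
        delete (slice((select tblnam), 0, batch))
    end
    ```

    If 1000 still causes trouble on a given machine, lowering the batch size is the first thing to try.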

    Is there something in the product development queue to enhance the delete feature so it manages local resources better, avoids this exhaustion, and eliminates the need for special scripting?

    • Fred
    • 10 days ago

    Which model of Mac do you have? How much RAM do you have?

      • Tim
      • 7 days ago

       MacBook Pro 13" M1 (2020) with 16 GB of RAM

    • Fred
    • 10 days ago

    Can you work it into your workflow to delete records sooner, in smaller batches?

      • Tim
      • 7 days ago

       Unfortunately, my workflow is that I load some pretty large tables (by importing CSV files pulled from a non-Ninox database), do some analysis, and then at some later date replace all the data with new imports for another analysis. So I just need to be able to efficiently clear out very large tables before I do the CSV imports with the new data pulls.

Content aside

  • Last active: 7 days ago
  • Replies: 5
  • Views: 41
  • Following: 2