Ninox Script to work with .csv files
Ninox has a convenient wizard for importing (.csv) data which is well featured and easy to use.
If you regularly work with data contained in .csv files, you may be keen to avoid retracing your steps in the import wizard for each import.
You can use Ninox Script to read, act upon and create records based on the content of .csv files.
I recommend you create a table designed to manage your imported (attached) files. My table looks like this:
In my example, I have a reporting task to complete every fortnight based on data contained in .csv files obtained from a number of sources. This table is useful for several reasons: it gives me a place to store similarly named .csv files for each period (my data sources), lets me pass the period or dates as columns to the script to differentiate data by batch (in case I need to roll back), and allows for the calculation of year-to-date analyses.
I simply select a record, then attach the files I need to process, like so:
So far, so good.
You may have noticed I added a button called Process Data on the form above. In my actual database, I use a single script to:
- check whether the process has already been run (completed) for the period;
- process the files;
- mark the process as complete in the table, so it is not run in error more than once.
You'll have your own unique process, so I've left steps 1 and 3 out of my example script. I've also made it easier to understand by dealing with a single file, and I've not added a reference to the period (because that might differ from your planned workflow).
So without further ado, here is a script to process data from the uploaded .csv file, creating a record in a Ninox table for each line of the file. I've documented the script heavily because it helps new users. You can largely ignore the debug parts, though they help when troubleshooting new files...
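For orientation, the overall flow of the script below (fetch the file's text, split it into lines on the URL-encoded newline, split each line into columns on the URL-encoded comma, decode each cell, then create a record per row) can be sketched in Python. This is an illustration of the logic only, not Ninox code; `parse_csv_text` is a hypothetical helper name:

```python
import urllib.parse

def parse_csv_text(text, has_header=True):
    """Mimic the Ninox approach: URL-encode the whole file, split lines on
    the encoded newline (%0A), split columns on the encoded comma (%2C),
    then decode each cell back to plain text."""
    encoded = urllib.parse.quote(text, safe="")
    lines = encoded.split("%0A")
    start = 1 if has_header else 0  # i := 1 skips the header row, as in the script
    records = []
    for line in lines[start:]:
        if not line:
            continue  # skip a trailing blank line, if any
        cells = [urllib.parse.unquote(c) for c in line.split("%2C")]
        records.append(cells)
    return records
```

In the Ninox version, the HTTP fetch (`http("GET", tShareURL)`) and the record creation (`create cos`) replace the plain string input and the returned list here.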
---
ninox cloud (for use with ninox public or private cloud)
mass table operations are most efficiently performed on the server
---;
do as server
---
let tDebugOutput := "";
---;
---
create a reference (or link) to a .csv file attached to the current record
---;
let tShareURL := shareFile(this, "cos_final.csv");
---
create an array of lines from the file using split by urlencoded line endings
---;
let tLines := split(urlEncode(concat(http("GET", tShareURL).result)), "%0A");
---
count the number of rows in the array containing the contents of the .csv file
the count is used when looping through the lines of the file below
---;
let tCount := count(tLines);
---
let tFirstDataLine := item(tLines, 1);
tDebugOutput := tDebugOutput + " | " + tFirstDataLine + " | ";
---;
---
initiate counter i
set i := 0 if a header row is not present in the .csv file
set i := 1 if a header row is present in the .csv file
---;
let i := 1;
---
while the value of the counter is less than the number of lines in the file, continue to process the lines of the file
---;
while i < tCount do
---
assign the data from line i of the .csv file to variable tLine
---;
let tLine := item(tLines, i);
---
create an array of columns from a line in the file using split by urlencoded comma
---;
let tColumns := split(tLine, "%2C");
---
assign the value of each column to variables that match columns in table cos
---;
let tNdisNumber := urlDecode(item(tColumns, 0));
---
tDebugOutput := tDebugOutput + i + " | " + tNdisNumber + " | ";
---;
let tNoteDate := urlDecode(item(tColumns, 1));
---
tDebugOutput := tDebugOutput + i + " | " + tNoteDate + " | ";
---;
let tNoteSupportItem := urlDecode(item(tColumns, 2));
---
tDebugOutput := tDebugOutput + i + " | " + tNoteSupportItem + " | ";
---;
let tManagementStyle := urlDecode(item(tColumns, 3));
---
tDebugOutput := tDebugOutput + i + " | " + tManagementStyle + " | ";
---;
let tNoteDurationalDecimal := urlDecode(item(tColumns, 4));
---
tDebugOutput := tDebugOutput + i + " | " + tNoteDurationalDecimal + " | ";
---;
let tNoteRate := urlDecode(item(tColumns, 5));
---
tDebugOutput := tDebugOutput + i + " | " + tNoteRate + " | ";
---;
let tNoteFee := urlDecode(item(tColumns, 6));
---
tDebugOutput := tDebugOutput + i + " | " + tNoteFee + " | ";
---;
let tClient := urlDecode(item(tColumns, 7));
---
tDebugOutput := tDebugOutput + i + " | " + tClient + " | ";
---;
let tCos := urlDecode(item(tColumns, 8));
---
tDebugOutput := tDebugOutput + i + " | " + tCos + " | ";
---;
let tCosRate := urlDecode(item(tColumns, 9));
---
tDebugOutput := tDebugOutput + i + " | " + tCosRate + " | ";
---;
let tCosFee := urlDecode(item(tColumns, 10));
---
tDebugOutput := tDebugOutput + i + " | " + tCosFee + " | ";
---;
---
create a new record in table cos for the current row from the .csv source file
---;
(create cos).(
ndisnumber := tNdisNumber;
notedate := tNoteDate;
notesupportitem := tNoteSupportItem;
managementstyle := tManagementStyle;
notedurationdecimal := tNoteDurationalDecimal;
noterate := tNoteRate;
notefee := tNoteFee;
client := tClient;
cos := tCos;
cosrate := tCosRate;
cosfee := tCosFee
);
---
increment counter i by 1
---;
i := i + 1
end
;
---
remove the reference (or link) to the .csv file attached to the current record
---;
unshareFile(this, "cos_final.csv")
end;
---
alert(tDebugOutput);
---;
---
display an alert to the user - process complete
---;
alert("Process complete.")
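One caveat worth knowing about the split-on-%2C approach above: it treats every comma as a delimiter, so a quoted field that itself contains a comma will be split into two cells. The sketch below (Python, purely for illustration; the example line is invented) contrasts the naive split with a real CSV parser:

```python
import csv
import io
import urllib.parse

line = '101,"Smith, John",42.5'

# Naive split on the encoded comma, as in the Ninox script: every comma
# is a delimiter, including the one inside the quoted name.
encoded = urllib.parse.quote(line, safe="")
naive = [urllib.parse.unquote(c) for c in encoded.split("%2C")]

# A real CSV parser respects the quotes.
proper = next(csv.reader(io.StringIO(line)))

print(naive)   # 4 cells: the quoted field was split in two
print(proper)  # 3 cells: ['101', 'Smith, John', '42.5']
```

If your source files never quote fields (as in my case), the naive split is fine; otherwise, consider cleaning the file first or exporting it with a different delimiter.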
Don't forget to include 'do as server'. If you don't, your scripts may run for hours.
Processing of the data now takes seconds. Hooray!
I'd also recommend using a dedicated .csv viewer (like Modern CSV or similar) when working with .csv files, as Microsoft Excel has a tendency to re-format raw .csv data in unexpected ways (especially date and time formats).
One more thing: if you wish to transform your .csv data using SQL before importing it into Ninox, check out https://hub.steampipe.io/plugins/turbot/csv . It's a command-line utility that allows you to query .csv files as if they were PostgreSQL database tables using standard SQL. You can even create new .csv files from query output.
Enjoy!
Kind regards, Andrew
19 replies
-
Very Interesting!
Didn't know you could comment using --- ---;
-
First Class - Useful and Practical. Nice one
-
Hi Andrew. I learned so much from your post here. I had to look a few things up, especially shareFile and unshareFile, which I hadn't heard of before. Can you tell me, what would happen if you didn't include the unshareFile command?
-
Hi, I could use a push in the correct direction here. I do have a .csv file attached to the current record. When I create the array, I get only one (1) line, and it appears to be HTML metadata rather than the body of the attached file itself. I do know that the attached file contains many records. Thanks in advance.
-
Hello Robert, from what you shared, my guess is that you were running that script on the Ninox Mac App and not in the Public or Private cloud. Is that correct?
-
Hello Robert, apologies for the delay in replying. I believe shareFile is a Ninox Cloud (Public/Private) only function as it creates a URL reference to the file on the Ninox Cloud with prefix "https://share.ninox.com/". In the comments of my original post, I noted:
said:
"for use with ninox public or private cloud"
I will download the latest Ninox Mac app and double-check what is possible later today.
Kind regards, Andrew
-
Really nice work here, very useful.
I have a csv file which contains text including line breaks, so this kinda messes up the import. Any way to maybe replace the line breaks in the entire column before running the import? It has to be before
said:
let tLines := split(urlEncode(concat(http("GET", tShareURL).result)), "%0A");
I guess?
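The embedded-line-break problem described above can be sketched as follows: before splitting on %0A, neutralize any newline that falls inside a double-quoted field, leaving the record-separating newlines intact. This is Python for illustration only, not Ninox Script, and `flatten_quoted_newlines` is a hypothetical helper name:

```python
def flatten_quoted_newlines(text, replacement=" "):
    """Replace line breaks that occur inside double-quoted CSV fields,
    leaving the record-separating line breaks intact. Run this on the raw
    file contents before splitting the text into lines."""
    out = []
    in_quotes = False
    for ch in text:
        if ch == '"':
            in_quotes = not in_quotes
        if ch in "\r\n" and in_quotes:
            out.append(replacement)  # newline inside a field: flatten it
        else:
            out.append(ch)
    return "".join(out)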
-
Hi , wow, I'm impressed. Thank you so much for the explanation. Got this to work now :) Amazing job, thank you for your time Andrew.
-
Hi all,
Has anyone had a positive result in the Cloud version?