
Running a script for each record in a subtable...

I have a table with a button that runs a script.

The script needs to execute some code based on each of the visible records in the subtable (which are linked to the current table).

How do I run a 'select' statement on only the records linked to the current table record (client), and not on ALL of the images for all of the clients?

eg: Main table = list of clients
Linked (sub)table = list of images specific to the current client

Thanks in advance,

J. 

19 replies

    • Fred
    • 2 yrs ago

    If you want to change the data in a field in all records of a related table, you can do something as simple as:

    let a := "New Text";
    ReferenceFieldName.textfieldname := a

    As you can see, you only need to use the name of the reference field and Ninox will automagically find only the related records. Nice and easy. No select statements. No hassle.

    Then we add a period and can start accessing any field in that table. If there were another reference field in the related table, we could follow that link as well.
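
    For example, a quick sketch with placeholder names, assuming the related table itself had a link field called 'AnotherLink' pointing at a third table:

    "collect a field from the records two links away";
    let names := ReferenceFieldName.AnotherLink.Name;
    "count how many of them we reached";
    count(names)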

    • John_Halls
    • 2 yrs ago

    Further to Fred, if it's a block of code you need to run, use a for loop

    for a in ReferenceFieldName do
        <<Code Block>>
    end
    

    Any reference to a field in ReferenceFieldName is prefixed with the loop variable a plus a period:

    for a in ReferenceFieldName do
        a.Field1 := a.Field1 + a.Field2
    end
    

    Regards John

    • jason.1
    • 2 yrs ago

    Hi both - got it to work!
    Thanks Fred and John Halls - really appreciate your help!

    I don't, however, need to write any changes to the fields - I'm trying to retrieve a URL from a field in the subtable for each visible record, then pass that URL to an openURL command.
    My mistake was treating the reference field as a stand-alone table, rather than as a reference field!
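
    In other words (a rough sketch with placeholder names - 'Images' for the reference field/table and 'Image URL' for the URL field):

    "what I was doing - selecting the whole Images table, i.e. every client's images";
    for a in (select Images) do
        openURL(a.'Image URL')
    end;
    "what works - looping over the reference field, which returns only the linked records";
    for a in Images do
        openURL(a.'Image URL')
    end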

    Thanks guys!
     

    • Fred
    • 2 yrs ago

    Can you post the code you are working with and the names of the fields that link the tables?

    or 

    post a copy of your db.

      • jason.1
      • 2 yrs ago

      Fred Sorry Fred - I've edited the post now. I couldn't get it to work at first, realised my mistake, and edited the post before I sent it, not realising I'd left in the first line saying I couldn't get it to work. It IS working now, just as you both suggested. Thanks again!

    • Fred
    • 2 yrs ago

    if you are comfortable, please post the final code so others can learn from your success.  

      • jason.1
      • 2 yrs ago

      Fred I will, once I've tweaked it - I'm struggling to get the for...in loop to only work on unique records. 🤔

      • Fred
      • 2 yrs ago

      jason We are happy to help with the for loop as well.

      • jason.1
      • 2 yrs ago

      Fred Really appreciate it - thank you..  

      I'm trying to work only on unique URLs but can't get the code to work:

      for a in unique(this.'Claim Infringement'.'Infringing Page URL') do
          openURL(a.'Infringing Page URL');
      end;
      

      I have a reference field, 'Claim Infringement', and I only want the code block to run once where there are duplicate URLs.

      eg: 

      http://junction10.photography/contact
      http://junction10.photography/terms
      http://junction10.photography/contact
      http://junction10.photography/live-feed

      I ONLY want the code block to work on:

      http://junction10.photography/contact
      http://junction10.photography/terms
      http://junction10.photography/live-feed
      

      and to ignore the duplicate contact URL..

      I'm unsure how the syntax works for the for loop, as the openURL line has two errors - 'Expression does not return a record' and field not found ('Infringing Page URL') - but I believe the error is actually in my for/do syntax...

      Thanks!
       

    • Fred
    • 2 yrs ago

    Is this code in a button?

    If so, when you click on the button do you want the DB to open each link?

      • jason.1
      • 2 yrs ago

      Fred Sort of - yes, the code is attached to a button. The finished code will actually call the Wayback Machine so I can archive the URLs on web.archive.org, so I'll actually be doing:

      openURL("https://web.archive.org/save/"+'Infringing Page URL');

      I think the browser window needs to remain open while the Wayback machine archives the page, so I don't need it to close immediately..  I'm happy to do that part manually. 

      The Wayback machine limits requests to 15 per minute, and there is no point in trying to save the same URL more than once in any session (once per day is usually more than enough).

      But some of the claims in my database have multiple URLs - in excess of 15 - so I need to make sure I don't exceed 15 requests, and unless I get an absolutely huge claim to deal with, just making sure that I only access unique URLs will do..  for the time being..  

      Thanks,

      J

    • Fred
    • 2 yrs ago

    for a in unique(this.'Claim Infringement'.'Infringing Page URL') do
        openURL(a.'Infringing Page URL');
    end;

    In line 1, you use the this command. I don't think you need it as you are using links to get to your actual data.

    Since you point to a specific field (so you can do a unique on it), you lose any linkage to the record. If you created a new formula field and just put in:

    'Claim Infringement'

    you will see a list of related record IDs. So if you stayed at this level, you could still step down into the fields.

    But since you went down to 'Infringing Page URL', you now just have an array of URLs. That is why Ninox says it can't find the field - it doesn't exist on a plain text value.
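
    To see the difference, here is a quick illustrative sketch you could drop into a formula field on the Claim table:

    let recs := 'Claim Infringement';
    "recs is an array of linked records - you can still reach their fields, e.g. recs.'Infringing Page URL'";
    let urls := 'Claim Infringement'.'Infringing Page URL';
    "urls is just an array of text values - there are no fields left to step into";
    "linked records: " + count(recs) + ", unique URLs: " + count(unique(urls))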

    Your new code could look like:

    for a in unique('Claim Infringement'.'Infringing Page URL') do
        openURL(a)
    end

    My testing verifies that this basic code will filter out any duplicate URLs and then open a browser tab for each URL in the array.

    If you want to throw in some error checking to make sure you don't go over the 15-URL limit, you can do something like:

    let array1 := unique('Claim Infringement'.'Infringing Page URL');
    let cntURL := count(array1);
    if cntURL > 15 then
        let msg1 := dialog("Warning", "There are more than 15 records. Do you want to continue?", ["Yes", "No"]);
        if msg1 = "Yes" then
            for a in array1 do
                openURL(a)
            end
        end
    else
        for a in array1 do
            openURL(a)
        end
    end
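
    Or, if you would rather just cap it automatically instead of prompting, something like this sketch simply skips anything past the first 15 unique URLs for the session:

    let array1 := unique('Claim Infringement'.'Infringing Page URL');
    let opened := 0;
    for a in array1 do
        if opened < 15 then
            openURL(a);
            opened := opened + 1
        end
    end
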
    • jason.1
    • 2 yrs ago

    Brilliant, thanks Fred - that works wonderfully!
    I wish I could find my way through the documentation more easily! :)

     

    Thanks for your help!

      • Fred
      • 2 yrs ago

      Glad things are working. 

      Their documentation is horrible. I’ve learned by watching the Nioxus videos and this forum. 

    • jason.1
    • 2 yrs ago

    Fred - I'm trying to expand on this further and would appreciate your thoughts...

    I have the 'Claim' table, which has the subtable 'Claim Infringements' - these are the numerous infringements that make up each individual claim.

    I have another table called "ArchiveHistory" which stores the date, time, and method of archiving. (On the 'Claim Infringement' table form, I have buttons to archive the individual 'Infringing Page URL' to Wayback & DevonThink and then create a record in the ArchiveHistory table - that all works hunky-dory.)

    On the main Claim table, I have two buttons - one to save ALL the 'Claim Infringements' URLs to the Wayback Machine, and another that saves them all to DevonThink Pro on my local storage.
    (I've kept them separate so I can abuse my local storage without clogging up the Wayback Machine unnecessarily.)

    Clicking the 'DevonThink' button will correctly save all the links and then create a new record in the "ArchiveHistory" table, so each individual infringement is listed with the date and time it was archived.

    That code works fine:

    " SAVE ALL TO DEVONTHINK
    
    If we don't know which container it is to be saved in, then set the switch
     to popup the destination selector";
    
    let noSelector := "Yes";
    if 'DevonThink Group UUID' = null then
        noSelector := "No"
    else
        noSelector := "Yes"
    end;
    
    for a in 'Claim Infringement' do
    
     "This block saves the URL in the variety of formats to Devonthink";
        "Save As Paginated PDF first";
        openURL("x-devonthink://createPDF?title=Page  #" + format(number(a.Ref), "000") + ": URL:" + a.'Infringing Page URL' + "&location=" + a.'Infringing Page URL' + "&referrer=" + a.'Infringing Page URL' + "&paginated=Yes&noSelector=" + noSelector + "&width=1600&destination=" + 'DevonThink Group UUID' + "&tags=" + replace('Claim Ref', "/", ":") + "," + Clients.'Short Code' + "," + "Paginated PDF");
        "Save as single page PDF";
        openURL("x-devonthink://createPDF?title=Page  #" + format(number(a.Ref), "000") + ": URL:" + a.'Infringing Page URL' + "&location=" + a.'Infringing Page URL' + "&referrer=" + a.'Infringing Page URL' + "&paginated=No&noSelector=" + noSelector + "&width=1600&destination=" + 'DevonThink Group UUID' + "&tags=" + replace('Claim Ref', "/", ":") + "," + Clients.'Short Code' + "," + "1 Page PDF");
        "Save as Web Archive";
        openURL("x-devonthink://createWebArchive?title=Page  #" + format(number(a.Ref), "000") + ": URL:" + a.'Infringing Page URL' + "&location=" + a.'Infringing Page URL' + "&referrer=" + a.'Infringing Page URL' + "&noSelector=" + noSelector + "&width=1600&destination=" + 'DevonThink Group UUID' + "&tags=" + replace('Claim Ref', "/", ":") + "," + Clients.'Short Code' + "," + "Web Archive");
        "Save Image URL as PDF and as raw image (to check metadata)";
        openURL("x-devonthink://createPDF?title=Image #" + format(number(a.Ref), "000") + ": URL:" + a.'Infringing Image URL' + "&location=" + a.'Infringing Image URL' + "&referrer=" + a.'Infringing Image URL' + "&paginated=Yes&noSelector=" + noSelector + "&destination=" + 'DevonThink Group UUID' + "&tags=" + replace('Claim Ref', "/", ":") + "," + Clients.'Short Code' + "," + "Image PDF");
        openURL("x-devonthink://createBookmark?title=Image #" + format(number(a.Ref), "000") + ": URL:" + a.'Infringing Image URL' + "&location=" + a.'Infringing Image URL' + "&source=" + urlEncode(a.'Infringing Image URL') + "&referrer=" + urlEncode(a.'Infringing Image URL') + "&noSelector=" + noSelector + "&reader=1&destination=" + 'DevonThink Group UUID' + "&tags=" + replace('Claim Ref', "/", ":") + "," + Clients.'Short Code' + "," + "Image Bookmark");
    
    "This block creates a record in ArchiveHistory table, linked to the infringement so we
    can see when it was archived.";
        let CurRec := a;
        "a is the reference link to claim infringement'";
    
        let NewRecord := (create ArchiveHistory);
        NewRecord.(ArchiveDate := now());
        NewRecord.(ArchiveTime := now());
        NewRecord.(ArchiveDateTime := now());
        NewRecord.(Method := "Batch");
        NewRecord.(Archive := "DevonThink");
        NewRecord.('Claim Infringement' := a)
    end
    

     
    The Wayback code is different (not to mention, yours :)), because it's important to only save each URL once and to reduce the impact of multiple requests exceeding the 15-per-minute limit.
    (It would be useful to improve the DevonThink code above to save each PAGE URL per claim only once too, but since there could be numerous IMAGE URLs on the same page, I can't work on unique PAGE URLs for that, as I have to save each image that appears on the same page.)
    Wayback, however, automatically archives all the images from each page submission, so we don't need to worry about that - we just need to archive each individual page once per session:

    let array1 := unique('Claim Infringement'.'Infringing Page URL');
    let cntURL := count(array1);
    if cntURL > 15 then
        let msg1 := dialog("Warning", "There are more than 15 records (" + cntURL + ") do you want to continue?
    
    Archive.org limits requests to 15 per minute. You should consider manually archiving these individually to avoid IP block.
    
        Continue?", ["Yes", "No"]);
        if msg1 = "Yes" then
            for a in array1 do
                openURL("https://web.archive.org/save/" + a);
                "Need to create new 'ArchiveHistory' record here!";
            end
        end
    else
        for a in array1 do
            openURL("https://web.archive.org/save/" + a);
            "Need to create new 'ArchiveHistory' record here!";
        end
    end
    

    The code I used to create a new record for Devonthink doesn't work in this code for Wayback - presumably because I'm working on an array of a specific field rather than a table reference...

    What would be the best way to create a new record in the "ArchiveHistory" table for each record of the 'Claim Infringement' reference field in this example?

    Many thanks,

    Jason
     

    • Fred
    • 2 yrs ago
    jason said:
    The code I used to create a new record for Devonthink doesn't work in this code for Wayback - presumably because I'm working on an array of a specific field rather than a table reference...

     You are correct. The variable array1 only contains URL data, not record data.

     

    jason said:
    How would be best to create a new record in the table "ArchiveHistory" based on each record of the 'Claim Infringement' reference field in this example?

     Just to make sure I understand this question: you want to make a record in ArchiveHistory for each record in 'Claim Infringement' that is in array1.

    If that is so then I have no idea. ☹️

    It is because we lose all reference to the record ID when we do a unique() - and that's something I have not found a workaround for.

    What just came to me is to create a record for all related records in 'Claim Infringement' and then have another piece of code that finds duplicates and deletes the last one.

    Is that something that would be OK with you?
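
    Something along these lines, perhaps - very much a sketch, assuming ArchiveHistory links back via the 'Claim Infringement' field as in your DevonThink code:

    "Pass 1 creates an ArchiveHistory record for every related infringement (as in the DevonThink loop)";
    "Pass 2: for each unique URL, keep the first matching ArchiveHistory record and delete the rest";
    "NB: in practice you would also filter the select down to the records just created (e.g. by date) so older history isn't touched";
    for u in unique('Claim Infringement'.'Infringing Page URL') do
        let dupes := (select ArchiveHistory)['Claim Infringement'.'Infringing Page URL' = u];
        let kept := 0;
        for d in dupes do
            if kept > 0 then
                delete d
            end;
            kept := kept + 1
        end
    end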

      • jason.1
      • 2 yrs ago

      Fred Ah, I get you - so effectively have the same loop as DevonThink working on ALL the URLs, but without any of the saving, and create the new "ArchiveHistory" record in that block (that would also ensure that the duplicate URLs are all marked as archived, which is true), then have a separate loop below it which only saves the unique URLs... That might work?
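
      Roughly, reusing the field names from the DevonThink block above (so treat this purely as a sketch):

      "Loop 1: log every infringement as archived, duplicates included";
      for a in 'Claim Infringement' do
          let NewRecord := (create ArchiveHistory);
          NewRecord.(ArchiveDateTime := now());
          NewRecord.(Method := "Batch");
          NewRecord.(Archive := "Wayback");
          NewRecord.('Claim Infringement' := a)
      end;
      "Loop 2: only submit each unique URL to the Wayback Machine once";
      for u in unique('Claim Infringement'.'Infringing Page URL') do
          openURL("https://web.archive.org/save/" + u)
      end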

    • Fred
    • 2 yrs ago

    Or this could work as well - put it right below the openURL:

    let array2 := first((select 'Claim Infringement')['Infringing Page URL' = a]);
    let newRec := (create ArchiveHistory);
    newRec.(
        URL := a;
        'Claim Infringement' := array2
    )

    Line 1 finds all records from Claim Infringement that have the same URL as the one the loop is currently on. For most instances it will only find one record, but when there are multiple records we take only the first.

    Line 2 creates the new record in ArchiveHistory; lines 3 - 6 copy the URL and link the new record in ArchiveHistory back to Claim Infringement.

    I hope you only care about linking to the first record that has the duplicate URL.
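
    For example, dropped into one of the loops from your earlier block (URL being the extra text field on ArchiveHistory suggested above, and the other field names taken from your DevonThink code - adjust as needed):

    for a in array1 do
        openURL("https://web.archive.org/save/" + a);
        let array2 := first((select 'Claim Infringement')['Infringing Page URL' = a]);
        let newRec := (create ArchiveHistory);
        newRec.(
            URL := a;
            ArchiveDateTime := now();
            Method := "Batch";
            Archive := "Wayback";
            'Claim Infringement' := array2
        )
    end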

      • jason.1
      • 2 yrs ago

      Fred Thanks Fred - sorry for the late reply, I've been away on business, but I've now returned and tried that - works great! Much appreciated! 👍
