The FileMaker JSON functions introduced in FileMaker 16 have changed the development game for many folks. JSONSetElement, JSONGetElement, JSONDeleteElement, JSONListValues, and JSONListKeys are powerful functions that let us collect data into an object, parse data from an object, or edit the object itself. We can even format a JSON object using JSONFormatElements. Since JSON's purpose is to exchange data, these functions seem good enough. We at Geist Interactive have combined them with other FileMaker functions to produce some custom functions (in JSONAdditions.fmp12) that are useful in many circumstances. Our custom functions actually manipulate the JSON in some way, doing just a bit more than the native functions. In this post, we'll take a look at a FileMaker filtering JSON function: JSON.FilterByExpression.

You can follow along with this download here.

Simple & straightforward

This is a simple custom function. Its purpose is to return the elements of an array that satisfy an expression. Here's the syntax:

JSON.FilterByExpression( array ; expression)

You simply pass it an array and an expression (in quotes), and this custom function returns those elements of the array that satisfy the expression.

The expression is the key. You write an expression that checks the value of a key in each element of the array. Those elements whose values satisfy the expression are returned.

Let’s apply this custom function to an example, starting with the use case.

Use Case: Given an object returned from the client’s shopping cart website api, chart any number of years’ unit sales. Give the user the ability to choose the start and end year, and display the units sold for that year range on a chart.

Here’s the “sales” array returned from the api and placed into the field: shoppingCart::JSON

"sales" : [
  { "year" : 2011, "units" : 43 },
  { "year" : 2012, "units" : 22 },
  { "year" : 2013, "units" : 100 },
  { "year" : 2014, "units" : 2 },
  { "year" : 2015, "units" : 12 }
]

JSON.FilterByExpression can easily parse through this array and return those elements that are within the boundary years.

JSON.FilterByExpression (
  JSONGetElement ( shoppingCart::JSON ; "sales" ) ;
  "JSONGetElement ( $item ; \"year\" ) ≥ 2012 and JSONGetElement ( $item ; \"year\" ) ≤ 2015"
)

This will return:

[
  { "year" : 2012, "units" : 22 },
  { "year" : 2013, "units" : 100 },
  { "year" : 2014, "units" : 2 },
  { "year" : 2015, "units" : 12 }
]

The FileMaker filtering JSON custom function is recursive. It loops over all the elements in an array and constructs a new array of those elements that satisfy the expression.

Express(ion) yourself

The goal of my custom function is to return those elements that have years between 2012 and 2015 inclusively. So that’s how I wrote the expression:

"JSONGetElement ($item ; \"year\") ≥ 2012 and JSONGetElement ($item ; \"year\" ) ≤ 2015"

Notice a few things:

  1. The FileMaker expression must be quoted.
  2. The expression will be applied to each element in the array
  3. The variable $item contains the current item in the array. (Kind of like the current record when looping through records.)
  4. I'm using the "JSONGetElement()" function to pull the 'year' value.
  5. Since the expression is quoted, I'm escaping the quotes around the 'year' key.
  6. I've got two parts to my expression in this example, but any expression or combination of expressions can work.

Here’s what my expression is saying:

For this object in $item, check to see if the value in the key “year” is greater than or equal to 2012 and less than or equal to 2015.

As this FileMaker filtering JSON custom function loops through all the elements in the array, it checks: Does the value in “year” satisfy this expression? If so, add the element to an array.
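In Python terms, the whole filtering pass is conceptually just this. It's a rough model, not the recursive FileMaker implementation, which evaluates the quoted expression against $item for each element:

```python
import json

def filter_by_expression(array_json, predicate):
    """Model of JSON.FilterByExpression: keep only the array
    elements for which the predicate (the 'expression') is true."""
    items = json.loads(array_json)
    kept = [item for item in items if predicate(item)]
    return json.dumps(kept)

sales = json.dumps([
    {"year": 2011, "units": 43},
    {"year": 2012, "units": 22},
    {"year": 2013, "units": 100},
    {"year": 2014, "units": 2},
    {"year": 2015, "units": 12},
])

# The expression "year >= 2012 and year <= 2015" as a predicate:
filtered = filter_by_expression(sales, lambda item: 2012 <= item["year"] <= 2015)
```

The FileMaker version does the same walk, only the predicate arrives as a quoted calculation string rather than a function.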

Finishing the use case

Once I’ve gotten back the array elements that satisfy the expression, I can use another custom function: JSON.GetValuesAtPath (array ; path) to return the “units” values. I’ve done this before in another charting exercise. We will look at this function in a later post.
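Conceptually, that second step just walks the filtered array and collects one value per element. A rough Python model of the idea (the real JSON.GetValuesAtPath custom function is covered in a later post):

```python
import json

def get_values_at_path(array_json, path):
    """Model of JSON.GetValuesAtPath: collect the value at the
    given key from every element of the array."""
    return [item[path] for item in json.loads(array_json)]

filtered = json.dumps([
    {"year": 2012, "units": 22},
    {"year": 2013, "units": 100},
    {"year": 2014, "units": 2},
    {"year": 2015, "units": 12},
])

units = get_values_at_path(filtered, "units")  # the values to chart
```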

The full calculation to pull filtered values from an array.

And thus, the chart is born. (Well, there’s a little more to the chart, but you can view that in the sample file).


The chart generated using JSON.FilterByExpression

The JSON Structure

We got an email from someone struggling with this custom function. It returned a "?". After examining the person's data structure, I realized what he gave us wouldn't work. It looked like this:

{
  "2012" : [ { "value" : "22" } ],
  "2013" : [ { "value" : "100" } ]
}

The structure, while valid, is unnecessarily complex, and it doesn't contain an array of objects that the custom function JSON.FilterByExpression can loop through. Here each year is a key pointing to a one-element array containing one JSON object with one key/value pair. The top level is an object. It is not an array, so our custom functions won't work.
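If you receive a structure like that, one option is to reshape it into an array of objects before filtering. Here's a hedged Python sketch of the idea; the same reshaping could be scripted in FileMaker with JSONListKeys and JSONSetElement:

```python
import json

def reshape_by_year(keyed_json):
    """Turn {"2012": [{"value": "22"}], ...} into an array of
    {"year": ..., "value": ...} objects that can be filtered."""
    keyed = json.loads(keyed_json)
    rows = [{"year": int(year), "value": entries[0]["value"]}
            for year, entries in sorted(keyed.items())]
    return json.dumps(rows)

source = '{"2012": [{"value": "22"}], "2013": [{"value": "100"}]}'
reshaped = reshape_by_year(source)
```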

By the way, this makes me wonder: what is the 'proper' structure for JSON data? I will have to explore that.

Our custom functions work with arrays. Specifically, they loop through the elements of the array and return a new array or list.

But that’s okay. Our custom functions work with VERY COMMON JSON data structures. Often an api will return arrays within the entire object, just as we see in the working example above. If we can drill down and grab an array from the object, we can use these custom functions.

New tool: FileMaker Filtering JSON

It is wonderful that we can now natively work with JSON in FileMaker to gather records into an object, parse api results, and manipulate and configure the JSON. We've just added a few functions we find useful.

Give our JSONAdditions file a try. There are some useful custom functions that will further your ability to not only generate data into a JSON object, but also to filter, merge, and otherwise manipulate an existing object.

If you do adopt these into your own solution, I recommend simply copying and pasting all the custom functions into your file; many of them depend on the others.

Go forth now, and expand your FileMaker JSON toolset.

If you didn’t download it before, take a look at the sample file. It uses the chart object in FileMaker.

FMPerception is a ghost-hunter, and a successful one at that. The realtime developer intelligence tool can point out all the ‘phantom fields’ on a layout. Let’s take a look at FMPerception’s successful ghost hunt.

What are Phantom Fields?

Unlike the prey of Ghost Hunters, phantom fields are real. They are the fields that show up in a layout's Table View.

The fields are added in the Modify Table view dialog.

 These fields are not on the layout in layout mode.

They haunt the layout, seen only in Table view. They are phantom fields!

One note: the term "phantom fields" doesn't come from FileMaker. It is how FMPerception labels these fields. Just so you know.

Giving up the ghost

FMPerception identifies these elusive phantom fields on any layout.

While looking at a layout, I can view all the layout objects. Each one, if it is on the layout, has a region designation. Phantom fields have the region “Phantom Field”. So if I wish, I can remove those from the layout.

Do I need to get rid of these?

I would say no. These phantom fields are there for a reason. They show themselves only in table view, and if someone has the correct privileges, phantom fields can be added for custom views. With the same privilege set, these phantom fields can be removed from the layout.

FMPerception doesn’t have a report for Phantom Fields like it does for “Unreferenced Objects” or “Broken References”, since phantom fields are not detrimental to a custom app in any way. FMPerception simply identifies them for you on each layout.


So be aware of phantom fields on layouts that allow Table view. And be aware of the identification of these fields in FMPerception. They are of little consequence overall, and they allow quick access for users to create a custom view of the data.


I was working with someone from my local user group recently. He was dealing with a slow data import process. The use case called for any user to be able to import data from an Excel file into the custom app and then work with that data. And it was slow. We took a look at it and talked through it together. A few hours later, the import was smooth and quick, and the data was available to all users shortly after the import finished. Let's take a look at best practices for a FileMaker data import.

In this post, we’ll talk about the target table: the table that will receive the data from the FileMaker Data Import. In a future post, we’ll talk about the best practices of the actual import process: how to script it so any user can perform an import.

The Target Table

The first question we should ask: into which table will the data be imported? It seems obvious: the table that holds the data. If I'm adding new students to a student information system, it seems I should import them directly into the Students table.

But that’s not always the case. There are a few reasons why we want to, at first, steer clear of the table that holds the data:

  • A table with many calculated fields will slow the import to a crawl. Even though the calc fields are not shown, the import forces the calcs to run.
  • The data might be wrong. It is possible to forget some columns or rearrange the columns during an import.
  • The data might need additional processing. Calculated fields shouldn’t process the data, but sometimes processing needs to happen.

So instead of the actual data table (importing students into the Students table), FileMaker developers (and developers on other systems) will direct the FileMaker data import into a temporary table.

A Temporary Table

The temporary table idea is worth exploring, and we’ll do a deep dive on it here.

A temp table has the following characteristics:

  • It is a table in your custom app or in another file.
  • The fields in this table are static fields: number, date, text fields. There are no calculated fields.
  • The fields in this table match the columns in the import source.
  • The scripting directs the import source to import the data here.

Since tables and fields are cheap, you can have one table per use case. If you have to import students and test scores, you can have one temp table for each without issue.

The Advantages

Temporary tables simply accept the data and thus have great advantages.

Quick Import

Importing into a table with no calculated fields is quick. The data simply gets added to the table.

Post-Import Validation

Once the data is imported into a temporary table, it can be validated before it gets placed into the real table. And this is a good reason to use a temp table. We don’t want data to be imported that is incorrect.

User Error

A school administrator needs to import test scores into his student information system. These scores, of course, are scores of his current students using the students’ district IDs. Somehow the import gets messed up. The administrator accidentally disturbs the order of the import: instead of the “StudentID” column being imported into the matching field, this column gets imported into the scores field. A validation process could go through the temp table’s data to ensure that all studentIDs in that field match an existing student. If not, the record shouldn’t be placed into the “Scores” table.

Data Error

The data might be wrong. There might be missing fields or missing data, or there might be incorrect data in the columns. A whole host of things could go wrong. Since we, as the developers of the system, may not control the source of the data, we have to make sure the data is error-free.

In either case, we want to script the process to handle the validation of each column and each row.  Our scripted process could then generate a report  to show the importer which records were rejected and why.
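To make the idea concrete, here is a rough Python model of such a validation pass. The field names and rejection reasons are made up for illustration:

```python
# Hypothetical validation pass over a temp table's rows.
known_student_ids = {"S-101", "S-102", "S-103"}

def validate_rows(temp_rows):
    """Split imported rows into accepted and rejected,
    recording a reason for each rejection."""
    accepted, rejected = [], []
    for row in temp_rows:
        if row.get("student_id") not in known_student_ids:
            rejected.append({**row, "reason": "unknown student ID"})
        elif not str(row.get("score", "")).isdigit():
            rejected.append({**row, "reason": "score is not a number"})
        else:
            accepted.append(row)
    return accepted, rejected

rows = [
    {"student_id": "S-101", "score": "88"},
    {"student_id": "S-999", "score": "71"},    # no such student
    {"student_id": "S-102", "score": "high"},  # bad score value
]
accepted, rejected = validate_rows(rows)
```

Only the accepted rows would move on to the real Scores table; the rejected list becomes the report for the importer.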

Post-Import Processing

Validated data could undergo further processing as it is moved to the correct tables. Here are some possible processing steps:

  • Duplicates are removed.
  • Phone numbers and email addresses could be added to a Contacts table.
  • Scripted calculations could be run and the results placed into static fields.
  • New records could be added.

The amount of processing that can happen is unlimited.
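Deduplication, for example, reduces to keeping the first row for each unique combination of key fields. A conceptual Python sketch (the key fields here are illustrative):

```python
def remove_duplicates(rows, key_fields=("student_id", "test_id")):
    """Keep only the first row for each unique combination
    of the key fields."""
    seen = set()
    unique = []
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

rows = [
    {"student_id": "S-101", "test_id": "T-1", "score": 88},
    {"student_id": "S-101", "test_id": "T-1", "score": 88},  # duplicate
    {"student_id": "S-101", "test_id": "T-2", "score": 92},
]
deduped = remove_duplicates(rows)  # two rows remain
```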


It is possible for two users to want to import data into the same table at the same time. Maybe they’re importing the same data, maybe one is importing Test 1 scores, and the other is importing Test 2 scores.

In either case, there's a strong case to be made for using a temporary table. Both users can import their data; the imports don't interfere with each other and can be processed in turn.

Often, users are importing their data in a session. As part of the process, a session record is created, and the user’s import is tied to that session. Sessions are good for reporting out validation errors. My script can record that Session 32 had six rejected records.

FileMaker Data Import: But I want it now!

From a user’s perspective, a FileMaker data import, in the manner described, seems like a lot. First the data is imported into a temp table. Then it is validated to some degree. Finally, it is processed into the correct tables.

Yep. The above descriptions do take some time. Users may balk at that. But let’s consider the alternatives:

  1. The data is imported into the actual tables, where many calculated fields reside. The user sees the "Importing" dialog for many minutes. She can do no other work in FileMaker.
  2. The data is imported incorrectly into the actual tables. The test score column ends up in the studentID column.
  3. The data is imported into the actual tables,  but the data source contains extra columns that need to be placed into related tables.

Patience is a (common) virtue

When I was a teacher, I also worked as the database administrator (two full-time jobs). Every year I'd have to set up our external services with student data. Those online services, from big testing and student-organization companies, would ask me to upload the file to a site and then WAIT for up to 24 hours until the data was processed. Inevitably I'd get a report back saying some records (or an entire import session) were rejected, with clear reasons why.

I think it is reasonable to ask users to wait a few minutes or hours for their imported data to show up in the custom app. I think they’d rather have the data validated and in the correct tables and fields than see wrong data instantly.

FileMaker to the rescue

Luckily, we have tools available to validate and process the data quickly. We can use server-side schedules or Perform Script on Server (PSOS). We can even bring in JavaScript or external microservices to help process the data in some way.

So we’re not out of luck. Users get to see their correct data sometime after their import.


The target table is an important consideration when designing an import process. It seems to me a temporary table is the right choice for all user-driven imports.

In the next post, we’ll take a look at other user-specific needs for an import process. Stay tuned.

Knowing how to use a FileMaker join table (though the technique is not exclusive to FileMaker) can solve many data-model questions for us. As we fiddle with the entity-relationship diagram to work out the exact relationships among entities in our custom apps, we can rely on the join table technique to properly structure seemingly complex data models. Let's take a look at how this concept is used in our work.

What does a FileMaker Join Table Solve?

The join table, as it is commonly called, solves the problem of how to structure the data model when entities are related in a many-to-many style. A classic example of such entities is classes and people: many people take many classes, and each class holds many people. A join table joins these two entities together.

The Entity-Relationship Diagram model of a many-to-many relationship

The join table relates the data correctly with no duplication of records. If I needed to set up a roster of students and I didn't use a join table, I'd have to duplicate either student or class records. For example, in the Classes table, I'd need one record (for the same class) per student. That's a problem: Classes is an entity, one record per unique class. I shouldn't duplicate class records just to make room for all the people.

What is a Join Table?

Join tables are simple tables that hold the primary keys of the tables they join together. In the above example, the join table has a field holding the key of the Person record and a field holding the key of the Classes record.

The Join table, joining Students and Classes.

This table holds one record for every unique combination of the joining entities. In this case, it holds one record for each student and for each class they are in. If Jamie is in five classes, this table holds five records with Jamie’s primary key value and each one of her classes’ primary key values.

Many records for students taking many classes.
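Modeled outside FileMaker, the join table is just a list of key pairs, and each "side" of the relationship is a query over it. A conceptual Python sketch with illustrative names:

```python
# Enrollment join table: one record per unique student/class pair.
enrollment = [
    {"student_id": "jamie", "class_id": "algebra"},
    {"student_id": "jamie", "class_id": "biology"},
    {"student_id": "alex",  "class_id": "algebra"},
]

def classes_for_student(student_id):
    """The portal on the student layout: rows drawn from Enrollment."""
    return [e["class_id"] for e in enrollment if e["student_id"] == student_id]

def students_in_class(class_id):
    """The portal on the class layout: rows drawn from Enrollment."""
    return [e["student_id"] for e in enrollment if e["class_id"] == class_id]
```

Both portals described below are exactly these two queries, with the name fields pulled in through the relationship.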

Naming a Join Table

It's a small but vital point: naming a join table. Naming things is actually really difficult, or at least it seems so. Here are some naming options:

  • The unoriginal but descriptive, easy-to-generate name: the two joined tables concatenated, as in "People_Classes". That works. It tells you what's in the table. But "People_Classes" as a name does not describe the entity the table represents.
  • The harder-to-generate name that describes the purpose of the table, what the table is holding. Instead of "People_Classes" I'd use "Enrollment" or "Register" or something like that.

Pick a naming style. I think the second choice is better. My future self will enjoy seeing "Enrollment" rather than "People_Classes", I think.

Viewing Records

Once the data is set up, the next step is to create the viewing area. A FileMaker join table gives us the freedom to view its contents (and thus the correct data) from both sides of the join table or from the table itself.

On a student record

From a student record, that is, from the Students table occurrence, we can see all of the classes in which she is enrolled. And we can do this, as usual, through a portal. But from what table occurrence will the portal be drawn? The description above gives the answer: ". . . in which she is enrolled." The portal on the student layout comes from the Enrollment table. We've established that the Enrollment table holds one record per class per student, so one student's class list is found here.

However, as I described above, this Enrollment table contains only foreign key fields. Where does the class name come from? Simple: we add the class name field from the Classes table through this relationship.

On the Classes Record

From the classes table occurrence, we can see all the students in each class. Again, a portal sourced from the Enrollment table is set up. And by adding the student name field into the portal, we see all the students in the class.

From the FileMaker Join Table

The join table is best used for a list, a report of the students in each class. Since we need the student name and the class name, we simply add these fields to the report layout.

Adding Records

So how are records added to this FileMaker join table?

Let’s take a typical Use Case. The principal of a school needs to enroll students in classes. She would go to the classes layout and, one at a time, add students to classes. (Forget the fact that this is terribly slow 🙂 ). So she starts on “Algebra” and needs to add students. So where is the new record created?

If you stop and think about it, there's only one place: in the join table, that is, the "Enrollment" table. In this table we create a record with the current class's primary key and a student's ID.

So then the script would follow these steps:

  1. Get the ID of the class into a variable.
  2. Get the ID of the selected student, in some manner, into a variable.
  3. In the join table, Enrollment, create a new record.
  4. Set the foreign key fields with the variable data.
  5. Close the window.
  6. Refresh the current layout.

The attached demo file shows my scripting method. Look at the Classes layout. Step 2 above is the most challenging of the steps (though even that isn't very challenging).
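The six steps above boil down to very little logic. Here's a conceptual Python model of steps 1 through 4; the duplicate check is my addition, not part of the post's script:

```python
enrollment = []  # the join table, modeled as a list of records

def enroll(class_id, student_id):
    """Steps 1-4 of the script: create one Enrollment record
    holding both foreign keys. Skips exact duplicates."""
    record = {"class_id": class_id, "student_id": student_id}
    if record not in enrollment:  # avoid a duplicate join record
        enrollment.append(record)
    return record

enroll("algebra", "jamie")
enroll("algebra", "jamie")  # second call adds nothing
```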

I’ve also, on the Student layout, used another method for adding records to the join table.

The Problem Solver’s Data Structure

The Join table solves a complex problem for us. It allows our clients to see complex relationships in their data. And it is a high-level tool that gives us more power at our fingertips.

In this video, I illustrate the above post.

Download the sample file here.


The topic of FileMaker tests is important, to me and to clients. I want to make sure my scripts work, and my clients don’t want bugs. For DevCon 2018, I put together a session about this very topic in the hopes of sparking the conversation. The recorded sessions from DevCon are now available online. Here is a link to my session on Building Testable Applications.

Introducing a new topic

In preparing for this talk, I did a fair bit of research on software testing. It became obvious early on that software testing has a lot of depth and breadth, so I wouldn't have time to cover existing frameworks in a one-hour session. I was also grappling with the fact that the FileMaker community seems to have limited discussion of testing.

I decided to aim my session at what FileMaker tests can do and to prove testing's worth with some live examples. My talk easily could have spanned two hours, so there are a few places where I had to gloss over key topics. I barely touched on modular scripting, the reasons to use custom functions, and pre-conditions for testing. We will address these at some point; they play a vital role in building FileMaker tests. My goal was to pique interest in testing in the FileMaker community.

Why test?

The first few minutes were focused on why we should write tests for complex business applications. Any developer who has implemented changes on a complex system already knows the answer: we need to STOP BREAKING CODE!

Fair enough…on to the hard part. How?

Factors to Consider

As you design a testable custom app, you need to consider these factors and use these tools. At first, these ideas might seem foreign, but the more you work with them and engage with them, the easier it will be to work with these in the future.

Designing a system so it CAN be tested

Modular scripting is the vital first step in creating testable applications. If you have a piece of executable logic that accomplishes a specific function, you need to test that one function. However, if you have "spaghetti" code, functions often end up duplicated in multiple locations, operating from different contexts, or performing multiple discrete pieces of logic in one script. Instead, make your scripts modular and stay DRY. That's the first step to enabling testing.


Every discrete piece of logic should not only accept a script parameter that is JSON, but should return JSON as well. An error object comes back if there was a problem, and a result object comes back if everything worked. This means you have a modular script that can be given a payload and return results. And JSON is the way to go: trying to build a complex payload using anything other than JSON introduces you to a world of hurt. You'll be working with a lot of data and will have to find some way to keep it organized.
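The shape of that contract can be sketched in a few lines of Python; the "error" and "result" key names follow the description above, not any official specification:

```python
import json

def double_controller(param_json):
    """A modular 'controller' that accepts a JSON payload and
    returns a JSON envelope: an error object or a result object."""
    params = json.loads(param_json)
    if "number" not in params:
        return json.dumps({"error": {"code": 400, "message": "missing 'number'"}})
    return json.dumps({"result": {"value": params["number"] * 2}})

ok = double_controller(json.dumps({"number": 21}))
bad = double_controller(json.dumps({}))
```

A test harness only needs to call the script with a payload and assert on the envelope that comes back.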

Custom Functions

Geist Interactive has released custom functions on GitHub. We have repositories for handling JSON validation and errors and for analyzing test results. Even if you aren't adopting testing yet, go get these custom functions now. They're amazingly helpful for handling parameters in a meaningful way.

My examples

The following examples range from simple to advanced and show what we mean by testing. Take a look at each one and pick out the above factors in each example.

The simple one

My first example was as simple as anything could be: a script that multiplied a number by 2. Amazing, right? This file isn't about what it does; it's all about how it does it. If you follow along in the video, you'll get a good sense of how to separate scripts into interface scripts and controller scripts. The controller script is testable; the interface script just calls on the logic.
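In Python terms, the interface/controller split looks something like this. It's a rough analogy, not the sample file's actual scripts:

```python
def double_controller(number):
    """Controller: pure, testable logic."""
    return number * 2

def double_interface(field_value):
    """Interface: gathers input from the UI layer, delegates to
    the controller, and formats the result for the user."""
    result = double_controller(int(field_value))
    return f"Result: {result}"

message = double_interface("21")  # "Result: 42"
```

Tests target the controller directly; the interface stays thin enough that it barely needs testing.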

Purchase order sample

I released a purchase order example that is probably the best thing to take from my session. The file is a good dive into a simple solution that's pretty intuitive. You just need to turn on the Script Debugger and follow the code. You'll learn in no time how we manage to create records using a controller.

Once you’ve got a grasp on that, open up the test file, and start writing your own tests. I dare say it is the best learning tool so far! I’d recommend imagining customer requests, changing the code, and writing tests. Then change the business logic, write more FileMaker tests, and you’ll get a feel for how testing can (or can’t) solve problems for you. As your solution gets more complex, you’ll need more tests, but it doesn’t really get more difficult. If you can implement a change to your code, and write tests to support it, you’re good!

We practice what we preach

Along with the custom functions and our work on modular FileMaker, we put this testing concept into practice.

Karbon Tests

As most people have heard by now, Geist Interactive released Karbon at DevCon. This is the most complex solution that is publicly available with tests integrated. If you’ve mastered the Purchase Order Example, feel free to give it a shot here as well.

Testing Generator

Writing tests can be monotonous. You are declaring a case, subject, payload, script, and assert functions for each test. Where code is predictable, we don't write it; we generate it. I released a generator file as well, which can generate a template for your testable scripts. Feedback welcome!

Long live FileMaker Tests

Since DevCon, I've received a few inquiries about testing. There are a few folks out there who are interested. I hope more folks adopt a testing model for their custom apps.

It might take some time before everyone starts using this, but I'm convinced our community will become more interested in testing over time. One issue is that testing has pre-conditions in your solution: if your scripting isn't modular, you aren't ready to start testing. I'll keep telling myself that's why I'm not getting questions. Everyone is just refactoring code to be modular…that's it!

What’s next

There are some interesting opportunities for test generation. We released Karbon_Generator, a developer tool that generates controller scripts. Because certain things are known about your code, we should be able to produce scripts to test any other script. There's a lot to iron out, but maybe by DevCon next year!

The FileMaker Web Viewer Bridge raises our interaction between FileMaker and JavaScript to the next level. Let’s take a high-level view of this framework.

Working with the web viewer, stage one

In all the work I've done so far with JavaScript and FileMaker, I've followed one setup. The JavaScript libraries, the functions, the CSS, and the HTML are all stored in FileMaker fields. These fields' contents are combined in a single calculated field, and that is the field a web viewer uses as its source. For much of the work I've done, this method is sufficient. I can display a date picker or a DataTables list of actual data from the custom app using this method. It worked.

The Problems

When I want to do more, this method causes a few problems. If I’m using the DataTables JavaScript library to show a portal-looking list of related data and a new record gets added to the related set, I’d have to make sure to add that data to the web viewer’s view. And that requires reloading all the JS libraries into the web viewer. The screen cap below shows the problem.

Likewise, it is a problem when the use case requires a button to change how the web viewer shows its information. If I have a chart, and I want the chart to transform from a bar chart to a line chart, I'd have to update the chart options from "bar" to "line" in the JS function text field (which I could do using FileMaker scripting).

Both of these possible use cases cause a problem to the experience of using the web viewer. Watch the movie below showing the DataTables JavaScript library integration and see if you spot the problem.

Let's break down what happened

  1. This JavaScript configuration was set up to sort by last name descending by default. Every time this web viewer loads, the records with last names starting with "Z" show up at the top.
  2. I sorted by Last Name Ascending.
  3. I added a new related record to the table using a popover, global fields, and a script.
  4. When I pressed “Add Row”, a script ran to grab the related data of this one parent record. The script formatted the data properly and then set the data in the Data field.
  5. This field refresh triggered the web viewer to reload.
  6. When the web viewer reloaded, it took the calculated HTML document and reloaded it into the web viewer, causing the flash and causing the sort state to be reset to its default for this configuration.

Bridging FileMaker and JavaScript

The FileMaker Web Viewer Bridge solves the problems described above. This framework bridges the FileMaker platform and any JS libraries you are using, forming, as most bridges do, a two-way connection. FileMaker can ‘talk’ to the JS loaded into a web viewer, and the web viewer can call back to FileMaker.

Here’s what it looks like.

Did you see it? Let's break down what did and didn't happen.

  1. I sorted the data using both position and Last Name (yes, that’s built into DataTables).
  2. I used a FileMaker popover, global fields, and a FileMaker script to add a new record.
  3. When the record was added:
    1. The new row immediately appeared.
    2. The sort state did not change.
    3. The record was added to the rows with no flash.

We still loaded the new record into the web viewer, but we did so without resetting the web viewer. That’s the power of the FileMaker Web Viewer Bridge. The popover I used to add the record contained a button that ran a FileMaker script. That FileMaker script sent the new record’s data to the JavaScript library, calling a function in DataTables JS to add a row.

Calling back to FileMaker

Just as we can use FileMaker to call a JavaScript function, the FileMaker Web Viewer Bridge allows a JavaScript function to call a FileMaker script. This could already happen using the fmp protocol, and indeed the Web Viewer Bridge uses the fmp protocol under the hood, but the complex syntax of the statement is removed.

The Devil’s in the details

We'll go into detail about how you can use the FileMaker Web Viewer Bridge to accomplish the above tasks for any JS library in future posts and videos. There are some specific points to cover. We'll cover the following topics:

  • Setting up the JavaScript to run FileMaker Scripts
  • Setting up the JavaScript to allow FileMaker to call a JS function.
  • Loading all HTML / CSS / JS into the web viewer.
  • Connecting to the FileMaker Web Viewer Bridge
  • Sending data created in FileMaker to a JavaScript function
  • Calling back from JavaScript to FileMaker

There will probably be other topics along the way.

It’s for Everyone

The power of the FileMaker Web Viewer bridge, all of the possibility it opens up, is not beyond your skill as a FileMaker developer and a budding JavaScript developer. Yes, you’ll have to learn some JavaScript, but we’ll cover that together and in minute detail.

Download the sample shown in the screen casts above as well as the original FileMaker Web Viewer Bridge file. Read through the documentation (found there on Github) and stay tuned here to learn more about the FileMaker Web Viewer Bridge.

TODAY – 10/22/2018 – 6:30 PM Pacific!

On Monday, October 22, Geist Interactive will be hosting the San Diego FileMaker User Group at our office in San Diego. If you don’t live in the area and still want to attend, you can join us via Facebook Live on our Facebook page.

We are going to attempt something interesting. This is not just a webinar or GoToMeeting; it’s a multi-camera, live-streamed event. We’ll have a “director” who can switch between multiple camera angles and the presenter’s screen to provide the best view of what’s happening at the event. If everything goes well, it’ll be like a TV show with a live studio audience. We’ll also take questions from the online crowd via Facebook Live.

Last Of the West Coast Tour

This is the final leg of our 2018 West Coast Tour, where we have been talking about Power Tools for the FileMaker Workplace Innovation Platform. Here are some of the topics we have covered in the earlier shows:

  • JavaScript Intro and mini-workshop
  • Karbon – our free Ambitious App Starter
  • Otto – fully automated data migrations
  • LedgerLink – connect to QuickBooks from FileMaker

Watch the Live Stream

The Live Stream will start at 6:30 PM Pacific on October 22nd, 2018. Join us on our Facebook page.

Attend in Person

If you are in the area, please come by and be part of our studio audience. We gather for food and drink starting at 5:30 PM at our office in San Diego. The presentations will begin at 6:30 PM.


Geist Interactive
1855 1st Ave Suite 103,
San Diego, CA 92101

Integrating JavaScript libraries is becoming an obsession of mine. I spend half my free time searching for new integrations that might work in FileMaker, and the other half implementing them. I look for JS libraries that are accessible and that provide functionality that could be used in a custom app. I think this next one, a FileMaker Audio Player, can be very useful. Let’s take a look at how to integrate this simple library.

Feel free to download the sample files here.

This video describes how I set this up. It’s on the shorter side, about 35 minutes. As usual, I make some mistakes along the way.

Key Points

Setting up the FileMaker audio player

Just like in the other videos, we set up a table in the normal way, with fields that hold each text file: the HTML, the CSS, and the JavaScript.

It’s common for me to do this, since it allows the JavaScript to work offline. All the code the library (in this case the FileMaker Audio Player) needs in order to work lives in a record in a table, and the web viewer can work with the text in each field.

Also, this structure closely resembles how an actual web page is constructed. A web designer will create one file for each CSS or JavaScript library and then link them together in the HTML document. We’re doing roughly the same thing here: we’ve got placeholder text in the HTML field, and FileMaker’s calculation engine, in the HTML_Calc field, does the work of placing the code inside the HTML page at runtime so the web viewer just works.
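
As a rough illustration of what HTML_Calc does with FileMaker’s Substitute() function, here is the same idea expressed in JavaScript. The placeholder token names are invented for this sketch.

```javascript
// JavaScript version of what HTML_Calc does with FileMaker's Substitute():
// swap placeholder tokens in the stored HTML for the contents of the CSS
// and JS fields. The token names here are invented for this sketch.
function buildPage(htmlTemplate, css, js) {
  return htmlTemplate
    .replace('/*__CSS__*/', css)   // placeholder inside a <style> tag
    .replace('/*__JS__*/', js);    // placeholder inside a <script> tag
}
```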

The placeholder text setup resembles how web designers structure their pages.

There’s another way to work with libraries: our Web Viewer Bridge setup. That’s a free tool you can use. We’ll study it in an upcoming blog post and video (more likely, a few of those).

The Audio

The FileMaker Audio Player can play an audio file stored in a container field; the library plays it in the web viewer. The web viewer, however, needs the audio file in base64-encoded form, so we have to base64 encode the file in another field or on the fly.

In the video I use the older FileMaker function Base64Encode. This works, but it inserts a return character after a set number of characters, and those returns corrupt the audio data. I should have used the newer function, Base64EncodeRFC, which takes an RFC number as its first parameter; passing 4648 tells the function not to insert any return characters.
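
If you’re stuck with output from the older Base64Encode function, one workaround is to strip the return characters before handing the text to the audio library. A quick sketch:

```javascript
// Workaround for the older Base64Encode output: strip the return characters
// it inserts before passing the base64 text along.
function cleanBase64(encoded) {
  return encoded.replace(/[\r\n]/g, '');
}
```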

Once the audio is encoded, it is placed in the Data field and referenced by a function in the JS_Function field.

The possibilities

This library does one thing: it plays audio while displaying the audio’s waveform in a web viewer. But it offers a number of options. Here are some you can adjust:

  • The color of the waveform
  • The color of the progress waveform
  • The color of the current-time indicator
  • The height of the waveform
  • The playback speed

There’s a lot you can do with this.

Additionally, you can add buttons that do more, controlling the waveform. In my final example, I added a button to change the playback speed and a slider to adjust the volume, in addition to the standard play/pause.
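
Options like those typically map to a configuration object passed to the library when it’s created. The option names below are typical of waveform players (wavesurfer.js uses names like these), but check your library’s documentation; this is illustrative only.

```javascript
// Illustrative configuration only -- option names like these appear in
// waveform players such as wavesurfer.js, but your library's may differ.
var playerOptions = {
  waveColor: '#999999',     // color of the waveform
  progressColor: '#4a90d9', // color of the progress waveform
  cursorColor: '#333333',   // color of the current-time indicator
  height: 96,               // height of the waveform, in pixels
  audioRate: 1.25           // playback speed multiplier
};
```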

Use your JavaScript skills

Finally, I put together JavaScript functions to make the user experience better. For example, the button that plays or pauses the audio has text that changes. If the music is not playing, the button says “Play”. When I click on the button and the music begins to play, the button’s text changes to “Pause”.

This type of functionality can only be done by actually using JavaScript to change the text. During DevCon 2018, we spent some time learning pure JavaScript. Though it wasn’t as exciting as the integrations, learning JS is essential to actually using the libraries. I couldn’t change the text from “Play” to “Pause” without knowing some JavaScript.
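
Here’s a sketch of that play/pause toggle. The `player` object is a stand-in for whatever API your audio library exposes; the method names are assumptions.

```javascript
// Sketch of the play/pause label toggle. The `player` argument is a stand-in
// for your audio library's API (isPlaying/play/pause are assumed names).
function togglePlayback(player, button) {
  if (player.isPlaying()) {
    player.pause();
    button.textContent = 'Play';
  } else {
    player.play();
    button.textContent = 'Pause';
  }
  return button.textContent;
}
```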

Try out the FileMaker Audio Player

Give this one a try. Even if you’ve no immediate use for it, it’s worth the time to practice your integration skills.

The Workplace Innovation Platform we know, called FileMaker, is exactly that: a platform. There are many components in the platform, and as we add them to support all of a client’s needs in a custom app, we have to take special care with each one. One thing to keep in mind is script step compatibility: we have to make sure each step works in the parts of the platform we intend to target. So let’s take a look at FileMaker script compatibility.

FileMaker Go, FileMaker Pro Advanced (on macOS and Windows), FileMaker WebDirect, FileMaker Server, and the FileMaker Data API work very similarly. But there are differences in whether script steps work: some steps are not supported in a given part of the platform, and some steps are only partially supported. It would be mind-boggling to remember which steps work where, so the Script Workspace gives us a tool to use.

Many of my script steps do not work for the FileMaker Data API component.

At the top right of the Script Workspace is a button that opens this dropdown. We can use it to examine the current script and see which steps are not compatible with the selected component; any incompatible script steps appear grayed out.

Additionally, we can use this dropdown to examine the script step list to see which steps are compatible.

So the lesson is: when we are writing a script, we should use the FileMaker script compatibility checker to make sure each step will work. We need to design scripts that are the best for each part of the platform. We need to review existing scripts to make sure each step will perform correctly in the chosen components.

Scripting smarter

Scripting for compatibility sounds like a lot of work: either I’ll have to create one script per component to do the same thing, or I’ll have to put a lot of logic inside one script to handle all the possible places it might run. But it really isn’t too hard. And if you think about it, what’s worse: more work, or incorrectly-performing scripts?

Here are some strategies you can use to ensure you’re scripting smarter: efficiently and effectively.

Get to know the components

Many seasoned developers have an encyclopedic knowledge of the entire platform, and that includes knowing which steps are compatible with each part of it. This simply comes with experience. I don’t think these developers have memorized all the compatible steps for, say, FileMaker Server; rather, they know what FileMaker Server can and cannot do.

For example, we know two ways to run scripts with FileMaker Server: Perform Script on Server or a server-side script schedule. Each of these uses the scripting engine inside Server, which opens an instance of the custom app only in memory; no UI is drawn. So we have to consider what this means. There’s no window, so the Move/Resize Window step is useless. That’s why it’s not compatible with Server. (If you do use this step on Server, nothing bad will happen; we’ll discuss the consequences further down.)

Likewise, the script step Get Directory does not work in FileMaker WebDirect. We know this because that component does not have access to a client’s file system. FileMaker developers who have been in the game for a long time know this.

One caveat

FileMaker, Inc. continually updates the platform’s functionality, and thus FileMaker script compatibility changes: steps that were not compatible with a part of the platform in the past may be compatible now. The script step Save Records as PDF, back in the days of FileMaker 15 and earlier, was not compatible with Server; since FileMaker 16, it has had partial support.

Once you know how a component of the platform works, you can more efficiently pick the steps that will work and, where possible, work around the ones you need that won’t.

Write Component-targeted scripts

This strategy involves writing a script for each component that will be used. If I need a process to run using the FileMaker Data API, I should write a script just for that process and component. Even if that same process is used in FileMaker on Windows, there is a benefit to having two separate scripts that do the same thing (roughly) that are customized to that component’s script compatibility.

My script workspace might have folders for each component:

  • Go
  • API
  • WebDirect

and so forth that hold specific scripts.

To start, I’d build a script that works completely on the most common component used in my custom app. I might start with FileMaker Pro Advanced on macOS. I’d build the script, duplicate it, and adjust it for the Data API as necessary.

In-Script Logic

Another strategy to use when working with FileMaker script compatibility is to put logic in your scripts that skips over incompatible script steps for a given component. This involves checking which component is running the script and then, at each step in question, skipping it or performing a different set of steps that do work for that component. The functions Get(SystemPlatform) and Get(Device) are good candidates for component checking. There are also custom functions that provide this functionality; _IsWin, for example, returns 1 if a Windows machine is running the script.

The Consequences

When FileMaker runs a script, it skips any steps incompatible with the component it’s running in. Sometimes that isn’t a problem: Show Custom Dialog is not compatible with FileMaker Server, so Server will skip that step, and that’s okay if the step was only meant to show a success message. If, on the other hand, the step included input fields whose values are used in the rest of the script, there’s a problem. So you have to consider each incompatible script step and decide whether the script will break when that step is skipped.

What is ‘Partial’?

There are quite a few script steps with “Partial” listed in the supported column. Save Records as PDF is one of those: it is partially supported in FileMaker Go, FileMaker WebDirect, and FileMaker Server & Cloud. That seems odd. Why would a step be partially supported? As with anyone’s enjoyment of Brussels sprouts, support seems binary; no one “partially” enjoys the gross-smelling green sphere.

Well, it turns out FileMaker can support some parts of some script steps. These tend to be steps with many options, and the Script Workspace is helpful in showing which parts are supported and which are not.

When we encounter a partially-supported step, we can use FileMaker’s documentation to review the details of that partial support. (By the way, did you know you can get to the documentation by right-clicking a step in the Steps menu and choosing “Help”? That’s handy.) The partial-support information is found in the notes.

In this case, Save Records as PDF is supported, but the dialog will not appear when this step runs on FileMaker Server. So it is wise to review the notes for partially-supported steps.

FileMaker Script Compatibility: Write scripts with all the tools

FileMaker script compatibility is an essential part of every developer’s skill set. Whether she knows how each component works and what its limitations are, or she relies on the compatibility checker, it is vital that each script runs successfully and does what it is intended to do in every part of the platform.

We need to do better logging in Karbon, both developer logging at design time, and process logging at runtime. This video covers some of the progress we have made towards solidifying our approach to logging.

Why not just use DBTransactions?

It is tempting to use the DBTransactions table as our log, and in fact, it includes many log-like features. However, DBTransactions is really about database transactions, not logging. Logging is really a separate thing, which has many nuances of its own. Conflating the two concerns is probably not wise in the long run.

Logging from within Transaction

The other problem is that you want to log info from within a transaction and make sure that data persists even if the transaction is reverted. So step 1 is to develop a method for logging within the transaction. That “transaction log” can later be sent off to the main log, which we’ll get to later.

This feature is not yet “done”. More work needs to go into generic logging, and process logging. Stay tuned for that.

Karbon Info

Karbon is our free framework for building complex apps. We share rough updates like this from time to time because people have expressed interest in what we are doing. For more info on Karbon see here.