Does FMP 18 Evaluate() Handle More Complex Expressions?

Yeah, I can at least partly agree with that. I often find it beneficial when FM tells me something is off…since that usually means I missed entering something that was supposed to be there.

However, I won’t argue against the benefits of logical completion of the expression. There are benefits to that, and times I wish FM would just show me what it could calculate.

The accounting side of me says I want to control it explicitly…I don’t trust anyone else. LOL


Yep…What I was saying above is that I DO control that code explicitly. Not FMI.

Thanks J. I’m glad you’re here on this forum. You always share interesting perspectives. :slight_smile:


There is going to be a point at which “robust” happily produces the wrong output because the input was erroneous. I know that is possible in FileMaker too, I don’t need examples I have seen it often enough. Having a very finicky linting engine means that I often get shown errors in my work that I missed. I like it.


Hi Malcolm,

I disagree. Robust means solid.

Plus, the user, not the developer, is the one who suffers with constant "?"s with no helpful error info.

I’ve tested my ‘evaluate()’ with hundreds of inputs and so far it’s rock solid.

If you want to send me a few hundred (or thousand) arithmetic evaluation examples, I’ll run them through and we can compare results. The REST service will complete in a fraction of the time FMP would take in a loop, also.

Make sure to include a bunch with unbalanced parens as FMP won’t even bother with those.


I agree with your definition of robust. I disagree with the application of that definition to FileMaker. The math engine is one area where FM is not as forgiving. Like @Malcolm, I actually prefer to see the error so I can make sure it’s doing what I expect.

FileMaker’s strength still remains, because we have the tools to make it do pretty much exactly what we want. Here is an example file showing a simple way to handle the error, and not show the dreaded “?” for your scenario.

Granted, this implementation is rather simple, and itself not as robust as it could be…but provides a good proof-of-concept for showing useful errors. After all, this is what would happen in almost any other language. You would handle errors, and provide usable feedback to the user, or make decisions about what you think the user was trying to do. So in your case, you may pass this off to your microservice to just power through the calculation. To my point, this is not always the desired behavior, so I can’t say this is a failure on FM’s part. It is just a different result.
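The pattern described above — catch the evaluation error and show the user something useful instead of a bare “?” — isn’t FileMaker-specific. Here is a minimal Python sketch of the same idea (the `evaluate` helper is hypothetical, just for illustration; it is not the example file’s actual logic):

```python
import ast
import operator

# Map AST operator nodes to the arithmetic they perform.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.USub: operator.neg,
}

def _eval_node(node):
    """Recursively evaluate a restricted arithmetic AST."""
    if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
        return node.value
    if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
        return _OPS[type(node.op)](_eval_node(node.left), _eval_node(node.right))
    if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
        return _OPS[type(node.op)](_eval_node(node.operand))
    raise ValueError("unsupported construct")

def evaluate(expr):
    """Return the value, or a plain-English error instead of a bare '?'."""
    try:
        tree = ast.parse(expr, mode="eval")
        return _eval_node(tree.body)
    except SyntaxError:
        return "Error: the expression could not be parsed -- check your syntax."
    except ZeroDivisionError:
        return "Error: the expression divides by zero."
    except ValueError:
        return "Error: only basic arithmetic is supported."
```

So `evaluate("3 + 4 * 2")` returns a number, while `evaluate("1/0")` or an unbalanced expression returns a readable message the UI can display directly.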

CalculationEvaluation.fmp12 (308 KB)


Still stuck on this point that keeps being repeated. To wit: since I wrote the 450 lines of relatively complex code, I don’t see how I could be in any more control over what’s happening. AFAIK, FMP doesn’t make the source code of the Evaluate() function available for modification.

Also, each time I make a code change, I have a JUnit test do regression testing on the hundred or so examples to make sure I didn’t break anything.

Plus the users get very helpful detailed messages. (See below). In my earlier screenshot above, I forgot to include transcendental functions’ results (FMP handles these too, of course).

In any case, for me, it’s a moot point since most of my clients have never heard of FileMaker. My code works with any HTTP-enabled application, not just FileMaker. My clients just want an endpoint (FileMaker not required).

The updated first example (called from FMP) below shows that the logic still works even with mismatched parentheses. The last example gives the user a plain-English suggestion about what to check in what they entered. FileMaker just gives “3” in this case as the result. That’s just a different design decision.
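Detecting mismatched parentheses and turning them into a plain-English hint is straightforward to sketch. Here is one hypothetical way a service could do that check before evaluating (this is an illustration, not the actual 450-line service code):

```python
def paren_feedback(expr):
    """Return None if parentheses balance, else a plain-English hint."""
    depth = 0
    for i, ch in enumerate(expr):
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:
                # A ')' appeared before any matching '('.
                return f"Unexpected ')' at position {i + 1} -- check for a missing '('."
    if depth > 0:
        return f"{depth} '(' left unclosed -- check for a missing ')'."
    return None
```

For a balanced expression it returns `None`; otherwise the caller can show the hint to the user instead of refusing the input outright.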


The variance here is use-case and design pattern differences.

For a user-facing expression calculator, we can show them what we want. The design decision asks: “how do I want this to work for the UX?”

For anything that is not a user-facing calculator, it is just not a problem, since the dev is in full control of the calculation.

For an endpoint or API, I’m simply not using FM for that anyway.

The key is what your clients need. That makes this a problem that FM doesn’t need to solve. So we are back to design choices and customer needs, not some failing of one or the other.


That doesn’t address any of my points, but whatever. :slight_smile:

(The apparent user calculator was just an example showing functionality, not really how it would be used by production clients.)

How do I modify the way a FileMaker function behaves (the result expression, the assumptions made for certain data inputs, the “?”, etc.)?

Let’s talk about the real use-case then. My users, at least for as long as I would consider myself a professional developer, have never seen a “?” unless it was an issue with the field being too small.

Not being argumentative…I’m truly interested in your viewpoint.


Could it be that, because you use a lot of open-source technology, you have developed the expectation that you should be able to modify native functions?
I am not aware of much non-open-source software (but again, I’m not a very geeky savant here) that lets you modify those functions or tells you exactly how they are built so you can understand how they behave…
I think the way you can do that is by using plug-ins (how those figure out how to interact with native functions is beyond my pay grade).

Hi Cecile,

Micro-services are much better than plug-ins in most cases since they’re not limited to FMP and you control the code. :slight_smile:

Plug-ins are compiled and, like a regular FMP function, you still have to live with whatever the developer did and hope they will respond to bug reports and any other problems. I’ve heard mixed reviews about some FMP plug-in developers’ responsiveness with bug fixes. Plus, plug-ins can also be quite expensive. Local version. Server Version. Unlimited version. Get that credit card ready…

Microservices are also compiled (wickedly quick), but you control the code, not a third party. Any change you make to a micro-service is automatically and immediately available to all callers, whether one or ten thousand (unless you don’t want that behavior, which, of course, you can also change). Again, all for FREE. They can run locally, on the network, or anywhere on the Internet.

The downside is that coding a micro-service involves a learning curve, though a relatively small one; REST services are easy in general. The PostFix example in this thread is an exception to that rule.


Josh, I know you’re not being argumentative. :slight_smile:

You help make any forum better. You respond, you give alternate points of view, you post example solutions, etc. Really excellent! Thank you for all you do.

The real-world use-case here is that some arithmetic (and other) expressions are gleaned from extremely huge text/other documents (several GB) and eval’ed in machine learning contexts. Can’t go much beyond that.

My initial posting with the FMP eval example in this thread was to help a FMP forum visitor back in October of last year.

Completely understood. Then, yes, I understand why FM isn’t the best option for evaluating the calcs. The math engine is much too rigid for that.

Appreciate your reply. Thank you.

If you want to send me a few hundred (or thousand) arithmetic evaluation examples, I’ll run them through and we can compare results. The REST service will complete in a fraction of the time FMP would take in a loop. Also, make sure to include a bunch with unbalanced parens, as FMP won’t even bother with those.


Let’s not do that. We both know where it’s going to end up.


Nice sample. I’m glad that it’s possible to get the evaluation error but it is a shame that it isn’t easier to obtain. Can it be discovered if the calculation has not been through the evaluate functions?

The short answer is yes. However, as we know, FM won’t allow us to get away with things like unbalanced parens, so the number of errors you would run into is very limited. One example that it will work with:

Sales::totalPrice / Sales::quantity

This is a valid expression, so FM accepts it as such. However, if Sales::quantity is zero, the division will return a “?”. EvaluationError would catch the error.

The one other nuance is that I believe EvaluationError ( ) needs to be wrapped directly around the Evaluate ( ) call in order to catch the error.

My logic, by default, returns…

“Infinity” as a result isn’t really correct, since any number divided by zero is “undefined” (as we learned in first-semester calculus, the quotient only tends to “infinity”, which is not a number, in the limit as the denominator approaches zero), but this is the way most calculators express this result anyway.

(assuming x > 0)

lim (x/y) = ∞ as y → 0⁺

Again, you never EVER give a user a “?” or similar “we have no idea what’s going on” message. That type of result does not inspire confidence. IMHO, it’s a sign of a poorly designed application or function.

(See “About Face” written by Alan Cooper, father of Visual Basic for more info.)

“0/0” is another case you have to consider. (L'Hôpital's rule - Wikipedia)
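The two edge cases above amount to a small decision table. Here is a hypothetical sketch of how that design decision could be encoded (this `divide` helper is for illustration only, not the actual production logic):

```python
def divide(numerator, denominator):
    """One possible design decision for division edge cases:
    report 'Infinity'/'-Infinity' for n/0 and 'undefined' for 0/0,
    instead of a bare '?'."""
    if denominator == 0:
        if numerator == 0:
            return "undefined"  # 0/0 is an indeterminate form
        return "Infinity" if numerator > 0 else "-Infinity"
    return numerator / denominator
```

The point is not the arithmetic but that the developer, not the engine, decides what the user sees in each case.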

FileMaker gives you this…

Customer-level production logic gives you this:

My only goal was to have an objective basis for comparison. :slight_smile:

RE: unbalanced parentheses.

IMH(umble)O any self-respecting linter or code editor should validate balanced brackets and parentheses. I personally like that FM doesn’t allow the imbalance.

Runtime errors are welcome in my book too. JavaScript, Python, and C++ also throw syntax/compilation errors on unbalanced brackets.

Inferences are fine in cases where the rules are widely-known or well-documented. E.g. the order of operations as it applies to math and boolean ops. Programming languages have to give developers some benefit of the doubt.

However non-standard inferences don’t really make a language more robust, rather they make it more fragile and susceptible to developer mistakes. Strictness is invaluable, and if it weren’t, supersets like TypeScript wouldn’t have been created. Isn’t it better to catch errors and validate developers’ intentions before compilation/runtime?