Declare Conformance Checker
Hi community,
I recently started playing around with ProM and especially with the Declare related tools. I used the Declare Maps Miner plugin to discover a Declare model from event log data. Afterwards, I wanted to test the quality of the mined model using the Declare Replayer plugin. The log I used contained 10000 traces with just 3 different event classes (I tried to attach a result screenshot but I only got the message "The file failed to upload."). The result is strange in two ways:
- According to the paper http://wwwis.win.tue.nl/~wvdaalst/publications/z12.pdf, it should be possible to measure fitness, appropriateness, and generalization. However, in my case only fitness is calculated.
- The second issue is that even the fitness does not seem to be calculated correctly. I used an "underfitted" model, i.e. one discovered from only a few traces. The Replayer plugin forces the user to map event classes from the log to the model, but it seems that only event classes that occur in both the log and the model are displayed. If an event class, let's call it class D, is not covered by the model, the fitness does not decrease.
I would be glad if anybody could help me. I suppose it is just a misconfiguration or something similar.
Thank you very much in advance,
LaAck
Comments
-
Dear LaAck,
I have been the main developer of the Declare Conformance Checker.
As far as the first question is concerned, I have checked myself and, indeed, the latest version, in which precision and generalization are computed, is missing; I am not sure why. It seems that the code has reverted to an old version where only fitness is computed. I will investigate what has happened and let you know as soon as I have time to work on it.
In general, if an event belongs to a class that is not mapped to any Declare activity, this does not mean that the event will cause a decrease in fitness. This is linked to the "open assumption": everything is allowed as long as it is not explicitly forbidden. A Declare model allows activities that are not in the model, provided that they do not violate any constraint in the model. For example, if the model consists of a constraint "A chain-succession B", A must be immediately followed by B and not by any other activity. However, if the constraint is "A succession B", A must eventually be followed by B, and any activity may occur between the execution of A and B. "Any activity" also includes activities that are not in the model.
Any activity not in the model is mapped to "tick": the special symbol, similar to "✓", used in, e.g., Definition 1 of the paper that you mention.
So, with reference to your second question: yes, it is true that fitness does not necessarily decrease if events occur for activities that are not part of the model.
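To make this concrete, here is a minimal Python sketch (not the actual Replayer code, just an illustration of the two templates above) showing why an event class D that is not in the model violates "A chain-succession B" but not "A succession B":

```python
# Illustration of the open assumption with made-up traces and activity names.
# An activity D that is not in the model breaks chain succession (because it
# sits between A and B) but is tolerated by succession.

def chain_succession(trace, a, b):
    """A chain-succession B: every A is immediately followed by B,
    and every B is immediately preceded by A."""
    for i, event in enumerate(trace):
        if event == a and (i + 1 >= len(trace) or trace[i + 1] != b):
            return False
        if event == b and (i == 0 or trace[i - 1] != a):
            return False
    return True

def succession(trace, a, b):
    """A succession B: every A is eventually followed by B,
    and every B is preceded by some earlier A."""
    seen_a = False
    open_a = 0            # occurrences of A still waiting for a B
    for event in trace:
        if event == a:
            seen_a = True
            open_a += 1
        elif event == b:
            if not seen_a:
                return False   # B without a preceding A
            open_a = 0
    return open_a == 0

trace = ["A", "D", "B"]   # D is not an activity of the model

print(chain_succession(trace, "A", "B"))  # False: D breaks "immediately followed by B"
print(succession(trace, "A", "B"))        # True: D between A and B is allowed (open assumption)
```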
Let me know if you have any more questions.
Regards,
Massimiliano de Leoni
-
Dear Mr. de Leoni,
Thank you very much for your helpful reply!
I would be glad if you could contact me once you know whether it will be possible again to measure the precision and generalization of Declare models.
Regarding my second question, you've absolutely got a point there. My mistake was to assume that an activity must still be modeled even if it is not restricted by any constraint. I now understand that this does not comply with the open assumption you've mentioned. By implication, this means that I should rather look for a decrease in precision.
Thank you very much again!
Regards from Germany,
Lars Ackermann