The International Competition on Knowledge Engineering for Planning
and Scheduling (ICKEPS) is a biennial event, hosted at the International
Conference on Automated Planning and Scheduling (ICAPS). The objectives of
the competition are to promote tools for knowledge acquisition and
domain modeling, to
accelerate knowledge engineering research in AI, and to encourage the
development of software platforms that promise more rapid, accessible,
and effective ways to construct reliable and efficient systems.
Knowledge Engineering (KE) for AI Planning has been defined as "the
process that deals with the acquisition, validation and maintenance of
planning domain models, and the selection and optimization of
appropriate planning machinery to work on them. Hence, KE processes
support the planning process: they comprise all of the off-line,
knowledge-based aspects of planning that are to do with the
application being built, and any on-line processes that cause changes
in the planner's domain model". We expect the competition to encourage
the development of tools across the whole KE area including domain
modeling, heuristic acquisition, and planner-domain matching.
The workshop timetable is available.
The 2nd ICKEPS is hosted at ICAPS-2007. It builds on the previous
competition, held at ICAPS-2005.
Based on the small number of competitors,
we decided that ICKEPS would take place as a showcase event, illustrating the
state of the art in knowledge engineering tools and different sorts of
simulators, together with a discussion of how far current technology is from
practical application.
Area of Scope
Area of Scope
The competition is open to authors of a tool, an integrated tool environment, or a tool platform (below we simply refer to the entry as "the tool") where the tool supports knowledge engineering for AI P&S purposes in at least one of the following categories:
- knowledge formulation (the acquisition and encoding of domain structure and/or control heuristics)
- planner configuration (fusing application knowledge with a P&S engine)
- validation of the domain model (e.g. using visualization, analysis, or re-formulation) or validation of the P&S system as a whole (e.g. using plan/schedule visualization)
- knowledge refinement and maintenance (e.g. through automated learning/training, or a mixed-initiative P&S process)
Stand-alone planners/schedulers are not eligible, but may be part of a tool. Competition entries must be distributable without any associated fees; the competition is otherwise open to all participants. The authors should follow the competition structure. In particular, at least one of the authors must present the system during the conference.
Phase 1: Pre-conference
Authors must submit a short paper describing the tool (no more than 5 pages including screenshots) to the workshop organizers by the submission deadline.
Papers must be in US letter size, and use the AAAI Style template.
Authors are encouraged to make their tools available for download from the web prior to the competition, so that the judges can evaluate the tools independently before the conference. All tools will ultimately be posted on the ICKEPS web page.
The short papers will be 'lightly' reviewed in order to:
- provide feedback to the contributors for updating their papers, and to flag matters to explain and address during the conference
- contribute to the overall evaluation of the submissions
If the tool is accepted for the competition, then the authors must submit a camera-ready copy of the paper to the workshop organizers by the camera-ready copy deadline, and at least one author must register for the competition via the standard ICAPS registration.
Phase 2: Evaluation through Simulation
In addition to the pre-conference paper submission, the competition will make available planning and scheduling simulations that competitors will use to evaluate their tools. These simulations will be available via a web service. Competitors will read a short text description of the competition domain, including a description of the simulation API, retrieve problem instances, submit plans for each instance, and receive feedback describing the quality of each plan. Each participant's interaction with the simulators will be logged, and the logs will be used as part of the tool evaluation.
The simulation server is up, and the simulations are available to everybody.
The simulation tester helps with connecting to the server.
Competitors may use the simulations to evaluate their tools prior to the conference.
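The interaction loop described above (retrieve instances, submit a plan per instance, receive a quality score, with all interactions logged) can be sketched as follows. This is a minimal illustration only: the class and method names (StubSimulator, fetch_instances, submit_plan) and the toy scoring rule are hypothetical stand-ins, since the actual simulation API is defined in the domain descriptions distributed via the web service.

```python
class StubSimulator:
    """Local stand-in for the remote simulation web service (hypothetical)."""

    def __init__(self, instances):
        self._instances = instances  # problem instance identifiers
        self.log = []                # interactions are logged for evaluation

    def fetch_instances(self):
        """Return the list of available problem instances."""
        return list(self._instances)

    def submit_plan(self, instance_id, plan):
        """Score a submitted plan; a toy rule where shorter plans score higher."""
        quality = max(0, 100 - 10 * len(plan))
        self.log.append((instance_id, plan, quality))
        return quality


def evaluate_tool(simulator, planner):
    """Run the planner on every instance and collect the feedback scores."""
    results = {}
    for inst in simulator.fetch_instances():
        plan = planner(inst)
        results[inst] = simulator.submit_plan(inst, plan)
    return results


# Example: a trivial "planner" that emits a fixed two-step plan.
sim = StubSimulator(["p01", "p02"])
scores = evaluate_tool(sim, lambda inst: ["load", "move"])
print(scores)  # {'p01': 80, 'p02': 80}
```

In the real competition the stub would be replaced by calls to the simulation web service, but the shape of the loop (fetch, plan, submit, read feedback) stays the same.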
Phase 3: At the conference
At least one author per tool must come to the conference and demonstrate the system in person. This will entail giving a short talk during the competition workshop and being ready to demonstrate the system and answer questions during the demonstration session. The competitors must bring their tools installed on their own laptops. Competitors should tailor their presentations to discuss how their tools helped them solve the simulated domains, but should also discuss other facets of their tools not exercised by the simulations.
The competition will take place during a half-day workshop, where competitors will present their systems and simulation providers will demonstrate theirs. The workshop and system demonstration will be open to everyone attending the conference.
The judges will decide the results using the reviews of the short papers, quantitative results from the competitors, interaction with the simulations, and the system presentations and demonstrations. The proposed criteria to be used by the judges are presented below; the final rules will be posted on the competition web page. There will be a presentation of the results and winners' prizes either during the conference reception or during/before the conference dinner. Final scores for each entry will also be posted on the competition web page after the competition.
Phase 4: After the conference
The simulation server will remain accessible as a long-term challenge to the
community.
Because the proceedings are distributed on a USB storage device, the publishing
timetable for compiling the competition proceedings has been postponed. This
leaves room to include late-coming showcases.
- Submission deadline for competitors: passed
- Submission deadline for further showcases: August 1
- Camera-ready copy due: August 10 (all-inclusive PDF due: August 15)
- Simulations: available
- Workshop event: September 22 (morning)
The competition was organized by Stefan Edelkamp, University of Bremen, and Jeremy Frank, NASA.
- Stefan Edelkamp, University of Dortmund (co-chair)
- Jeremy Frank, NASA Ames, U.S.A. (co-chair)
- Wheeler Ruml, Palo Alto Research Center (PARC), U.S.A.
- Hector Geffner, Departamento de Tecnologia UPF, Spain
- Robert Hawkins, Space Telescope Science Institute, Baltimore, U.S.A.
- Roman Barták, Charles University, Czech Republic
- Lee McCluskey, University of Huddersfield, United Kingdom
- Robert P. Goldman, SIFT, U.S.A.
Comments on any part of the contents of this page are welcome.