Competition Rules

ART Testbed Competition 2006

The following rules will be in effect for the 2006 ART Testbed Competition at AAMAS 2006.

Game Rules

  • All games will use the following game parameters (as set in the gameParameters.properties file; a sample file is sketched after this list):
    • Average-Clients-Per-Agent=20
    • Number-of-Painting-Eras=10
    • Sensing-Cost-Accuracy=0.5
    • Previous-Client-Share-Influence=0.1 (NOTE: this value is new as of April 25, 2006)
    • Client-Fee=100.0
    • Opinion-Cost=10.0
    • Reputation-Cost=1.0
    • Thread-Wait=3
    • Timesteps-per-Session: The number of timesteps in a game will be selected randomly from a uniform distribution such that each game will require 30-60 minutes to complete.
  • Collusion among agents is strictly prohibited. More specifically, no agent may benefit from basing any portion of its strategy on non-public knowledge of another agent's strategy, and no agent may employ a strategy designed to benefit another agent at its own expense.
  • All participants are expected to follow the “spirit” of the ART Testbed Competition. Certainly, methods may exist for overcoming the Testbed’s security (falsifying messages from the Sim or other agents, accessing the database directly, colluding through prior arrangement among agent teams, writing to or reading from files on the Testbed host, or interrupting processes). However, such covert strategies benefit no one: they delay the advancement of trust research and put offenders at risk of harming their own reputations within the research community. The competition organizers acknowledge that this first competition may have weaknesses, but as a tool for the entire trust research community, and through contributions from all, future iterations of the Testbed will be more robust. The organizers reserve the right to disqualify participants who do not follow the spirit of the competition, and they may make additional rule judgments as needed.
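
For reference, a gameParameters.properties file assembled from the settings above might look like the following. A Java properties file is plain key=value text; the exact formatting of the file shipped with the Testbed may differ.

    # gameParameters.properties (sketch assembled from the list above)
    Average-Clients-Per-Agent=20
    Number-of-Painting-Eras=10
    Sensing-Cost-Accuracy=0.5
    Previous-Client-Share-Influence=0.1
    Client-Fee=100.0
    Opinion-Cost=10.0
    Reputation-Cost=1.0
    Thread-Wait=3
    # Timesteps-per-Session is chosen randomly per game (see above)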

Submitting Agents

  • The deadline for submitting agent class/jar files to the competition organizers is April 30th at midnight GMT (which corresponds to 7pm 4/30 for U.S. Central, 1am 5/1 for U.K., and 2am 5/1 for Western Europe).
  • Agents should be submitted via email to kfullam@lips.utexas.edu as a SINGLE .class or .jar file. The subject line of the email should be your agent’s name.
  • The name of the submitted file should exactly match your agent name listed on the Registered Agents page (for example, Neil.class or Hal9000.jar).
  • Submitted files should be “join-game-ready”, which means they should run correctly when joined to a game via the Game Setup Interface’s Join Game Panel. Class files should belong to package “testbed.participants”. Jar files should include a manifest file with the following line:

    Main-Class: testbed.participants.<agentClass>

    where <agentClass> is the name of the class inheriting from the abstract testbed.agent.Agent class. (A packaging sketch follows this list.)

  • Agents should be compatible with Testbed version 0.3.1 (http://sourceforge.net/project/showfiles.php?group_id=148987).
  • Participants are strongly encouraged to submit a working (though not necessarily final) “test” version of their agents AT LEAST ONE DAY BEFORE the April 30th deadline. Participants meeting this “test” deadline will receive confirmation on April 30th from the organizers that their agents are functioning properly in the official Testbed environment. Of course, participants may submit final versions of their agents up until the April 30th deadline (midnight GMT); the “final” agent version may be different from the “test” version.
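
As an illustration of the jar requirements above, the following sketch packages a hypothetical agent named Hal9000 (the name is borrowed from the file-naming example above); the jar invocation is standard JDK tooling, not competition-specific.

    manifest.txt (must end with a newline):

        Main-Class: testbed.participants.Hal9000

    Packaging command, run from the directory containing the testbed/ class tree:

        jar cfm Hal9000.jar manifest.txt testbed/participants/Hal9000.class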

Preliminary Round

  • Games for the Preliminary Round have been organized (see competition_results_prelim.htm). Twenty-one games will be conducted, such that 1) each agent plays in at least six games and 2) each agent plays every other agent in at least one game.
  • All games will be conducted with five participating agents. Each game will require 30-60 minutes to complete.
  • Each agent's score for the Preliminary Round will be the average of its scores for six games. The score for each game is calculated as the normalized bank balance (bank balance divided by the number of timesteps in the game); a scoring sketch follows this list. A few agents will play in more than six games; however, only six games will be averaged to compute each agent's score.
  • Games in the Preliminary Round will begin on May 2 and will be completed before May 10. To speed up the completion of the Preliminary Round, the set of games may be divided among multiple instances of the ART Testbed running on different computers, as supervised by the competition organizers.
  • Unfortunately, real-time, online viewing of games in progress has not yet been implemented. Therefore, competition organizers will post scores (competition_results_prelim.htm) for games in the Preliminary Round after the Final Round has begun.
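
The scoring rule above amounts to the following sketch; this is illustrative code, not official competition software, and the class and method names are invented for the example.

    public class ScoreSketch {
        // Game score: final bank balance normalized by game length,
        // e.g. gameScore(12000.0, 80) == 150.0.
        static double gameScore(double bankBalance, int timesteps) {
            return bankBalance / timesteps;
        }

        // Round score: the average of the (six) counted game scores.
        static double roundScore(double[] balances, int[] timesteps) {
            double sum = 0.0;
            for (int i = 0; i < balances.length; i++) {
                sum += gameScore(balances[i], timesteps[i]);
            }
            return sum / balances.length;
        }
    }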

Final Round

  • Games in the Final Round will take place on May 10-11.
  • Finalists will be the five agents achieving the highest average scores in the Preliminary Round.
  • Ten games will be conducted among the finalists. All finalists will compete in all games. Each game will require approximately 60 minutes to complete.
  • The winning agent will be the finalist achieving the highest average score over all games in the Final Round. The score for each game will be calculated as the normalized bank balance (bank balance divided by the number of timesteps in the game).
  • Since real-time, online viewing of games in progress has not yet been implemented, competition organizers will post scores (competition_results_final.htm) for games in the Final Round as they are completed. Further, games will be shown in real-time at the Competition site at AAMAS.
  • All games in the Final Round will be run with agents submitted by the original (April 30) deadline (NOT the May 9 resubmission deadline for informal games).

Informal Games

  • In addition to the Final Round of the Competition, a series of informal games will take place during AAMAS (May 10-12).
  • All competition participants (finalists and non-finalists) are permitted to change their agent code and resubmit agents by May 9 at midnight GMT (9am AAMAS/Hakodate time) for the purpose of playing in informal games only. If a participant chooses not to submit an updated agent, the originally submitted agent will be used.
  • Informal games will be conducted upon the request of a participant. To request an informal game, email kfullam@lips.utexas.edu with your agent's name and the names of requested competitors for that game.
  • Each participant can request multiple informal games at any time through May 12; however, scheduling of informal games is at the discretion of the Competition organizers.
  • Since real-time, online viewing of games in progress has not yet been implemented, competition organizers will post scores (competition_results_informal.htm) for informal games as they are completed. Further, games will be shown in real-time at the Competition site at AAMAS.

Liability

  • Problems may occur as the competition is conducted, especially since the ART Testbed Competition is in its infancy. By participating in the competition, participants agree that the competition organizers will not be held liable should the competition fail to occur or should participants disagree with a game rule judgment.
  • Participants acknowledge that results of all competition games will be publicly posted on the ART Testbed website.
  • Participants acknowledge that the competition organizers may write technical papers about the competition and results, including references to strategies employed by participating agents, with proper credit given.
  • It is the best intention of the competition organizers to provide participants with future venues for dissemination of results, including publication, presentations, and discussion opportunities.

Release of Agent Code

  • By participating in the ART Testbed Competition, competitors agree to release their .class and .jar files to the competition organizers for 1) future demonstrations of the ART Testbed and 2) posting on the ART Testbed website. In each case, proper credit to competitors will be given.
  • The competition organizers request that competition finalists submit their agent source code for posting on the ART Testbed website. In return, finalists will be given proper credit for their work, as well as an invitation to publish in an upcoming ART Testbed-related publication. Online posting of source code will not occur until 1) posting is explicitly authorized by participants or 2) publication by participants is in progress, whichever occurs first. Additionally, all other participants are invited to submit their source code for posting on the ART Testbed website.