From: Omar C. <oma...@gm...> - 2009-07-23 03:47:02
I'm working through the example posted as an introduction to SACI, but without
luck getting it to work.
I'm running the SACI master on Windows and saci-client on Linux.
The MAS file is:

MAS dis {
  infrastructure: Saci
  agents:
    bob at "148.226.110.153";
    alice at "148.226.110.239";
}

and the agents' ASL files are:

Bob agent:
+!start
  : true
  <- .wait(1500);
     .send(alice, tell, hello).
+hello[source(A)]
  <- .print("recibido hola de ", A).

Alice agent:
+hello[source(A)]
  <- .print("recibido hola de ", A);
     .send(A, tell, hello).

I run "ant saci" at 148.226.110.153 (Windows) and "ant saci-client" on
148.226.110.239 (Linux).
Then I run "ant" on Linux and I get this error on the console:

[bob] The content of the message '(tell :sender bob :content hello(uno)
:receiver alice :language AgentSpeak :reply-with mid1 )' is not a term

and in the system I get:

     [java] [CommSAg,bob] Receiver environment was not found
     [java] [CommSAg,bob] Receiver alice was not found

What could be wrong? Please help.
I've read the FAQ, but found nothing about this there.
Thanks in advance.
From: Jomi H. <jom...@gm...> - 2009-07-20 15:23:07
Hi Sebastien,

we do not have a method for the environment to inspect the agent's
mind (it is conceptually strange), but there are several ways to solve
your problem. Some ideas:

1. Use a controller that gets a copy of the belief base of all
agents in XML. The problem is then handling the XML to find the
information. You can find an example at demos/controller.

2. Assuming you are using the centralised infrastructure, you can get
a reference to RunCentralisedMAS, which has references to all agents,
where you can inspect the BB. (Ask me for details if you cannot find the
'path' in the API.)

3. Create an agent for that. At the end of the cycle, this agent asks
the others for their skills:

	.send(Ag, askOne, skill(_), Answer);
	// you may find the internal action .all_agents useful; it
	// gets a list of all agent names

   or

	.broadcast(askOne, skill(_));

   In the second case, the answers will be added to the BB with the
   corresponding source annotations.

HTH,
Jomi

On Jul 19, 2009, at 4:53 PM, Sebastien Mordelet wrote:
> Hi,
>
> thanks for making this point clear! Now this is working "almost" well
> (I need to tune my MAS more sharply).
> I have one more question, about the environment:
> At the end of each step I need to compute a kind of mean based on the sum
> of a belief held by each agent, for example: skill(34).
> So... I know how to addPercept to an agent, but what is the reverse
> operation? I saw getPercept? Is this it? How do I use it?
>
> Thanks a lot
>
> ------------------------------------------------------------------------------
> Enter the BlackBerry Developer Challenge
> This is your chance to win up to $100,000 in prizes! For a limited time,
> vendors submitting new applications to BlackBerry App World(TM) will have
> the opportunity to enter the BlackBerry Developer Challenge. See full prize
> details at: http://p.sf.net/sfu/Challenge
> _______________________________________________
> Jason-users mailing list
> Jas...@li...
> https://lists.sourceforge.net/lists/listinfo/jason-users
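[Editor's note: option 3 above could be sketched in AgentSpeak roughly as
follows. This is an illustrative, untested sketch; the agent name, plan
names, and the use of .all_names (the internal action listing agent names
in current Jason; the reply above calls it .all_agents) are assumptions,
not code from this thread.]

```
// hypothetical "collector" agent: ask each agent for its skill
// belief, then print the mean of the answers (untested sketch)
!collect.

+!collect
   <- .all_names(L);              // L = names of all agents, incl. self
      !ask_all(L, 0, 0).

// base case: all agents asked, compute and print the mean
+!ask_all([], Sum, N) : N > 0
   <- Mean = Sum / N;
      .print("mean skill: ", Mean).

// ask the next agent and accumulate its answer
+!ask_all([Ag|T], Sum, N) : Ag \== collector
   <- .send(Ag, askOne, skill(_), skill(S));
      NewSum = Sum + S;
      M = N + 1;
      !ask_all(T, NewSum, M).

// skip our own name in the list
+!ask_all([_|T], Sum, N)
   <- !ask_all(T, Sum, N).
```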
From: Sebastien M. <se...@gm...> - 2009-07-19 19:53:54
Hi,

thanks for making this point clear! Now this is working "almost" well
(I need to tune my MAS more sharply).
I have one more question, about the environment:
At the end of each step I need to compute a kind of mean based on the sum
of a belief held by each agent, for example: skill(34).
So... I know how to addPercept to an agent, but what is the reverse
operation? I saw getPercept? Is this it? How do I use it?

Thanks a lot
From: Felipe R. M. <fel...@gm...> - 2009-07-16 23:38:11
Now I got it; I did not realize that the variables were the problem. And I
agree that Rafael's solution to my problem is the best one. I had partially
tried to make strings out of the other terms of the predicate, so I will do
the same with the rest. I intend to handle them internally (in Java) anyway.
Thanks for the tips. And sorry about sending this question to the wrong
email address.

My parsing was initially done with 'agent' (I was just experimenting with
asl files, really), but now I will correct my definitions. The idea of what
I am doing is to receive norms via Jason's percepts and then process them
(similar to my paper at AAMAS), but this time I have the notion of abstract
and concrete norms, so an agent would be aware of an abstract norm (with
variables) and then create instances of the norm based on the unifier
obtained from matching an activation condition in the norm.

Regards,
Felipe

On Thu, Jul 16, 2009 at 4:01 PM, Jomi Hubner <jom...@gm...> wrote:
> Hi Felipe,
> which method are you using to parse?
>
> BTW, in Jason syntax, since beliefs have to be ground, anything with vars
> is considered a rule.
> If you are using 'belief' from the JavaCC, it is normal that those
> predicates are parsed as rules. However, if you are using 'literal', they
> should be parsed as a literal (and not as a rule).
>
> Jomi
>
> On Jul 16, 2009, at 3:48 PM, Felipe Rech Meneguzzi wrote:
>
>> Hi all,
>>
>> I was doing some experiments with Jason and I noticed that some
>> predicates with which I initialized an agent, expecting them to be
>> beliefs, ended up being considered rules. For example, the two
>> predicates below are parsed as rules (becoming themselves the head of a
>> rule with the body being true):
>>
>> norm(obligation,
>>      evacuate(P,X,Y),
>>      "10<=X & X<=40 & 20<=Y & Y<=80",
>>      "at_loc(P,X) & unsafe(X) & safe(Y)",
>>      emergency_level(X,low),
>>      12).
>>
>> event(a(B),c(DS)).
>>
>> It seems that whenever a predicate has terms with variables in them,
>> they are considered rules for some reason. The problem with this is
>> that I expected them to trigger initialization plans, and with them
>> being rules, these plans are never invoked. Is that the expected
>> behaviour?
>>
>> Regards,
>> Felipe Rech Meneguzzi
>> fel...@gm...
>> http://fmeneguzzi.blogspot.com
>
> --
> Jomi Fred Hubner
> ENS Mines Saint-Etienne
> 158 Cours Fauriel
> 42023 Saint-Etienne Cedex 02
> France
> http://www.emse.fr/~hubner

--
Felipe Rech Meneguzzi
fel...@gm...
http://fmeneguzzi.blogspot.com
From: Rafael H B. <r.b...@ac...> - 2009-07-16 19:17:34
Hi Sebastian, No, this is a type of *environment* that wait for actions from all agents before going to the next step (unless it times out, if I remember well), only *actions* are visible to the environment. Internal actions are executed as part of the agent's own reasoning cycle, so they don't count for the actions the environment is waiting for. Note that you might need lots of AgentSpeak reasoning cycles before an environment action execution request is sent to the Environment class. Cheers, Rafael Sebastien Mordelet wrote: > Hi, > > thanks for your response. In fact, I already used the queue policy. But > whatever policy I tried the same occurs. So may be I misunderstood > something. > 1 - My environnement only contains the overrided methods that occurs > before and after step. And that's all. So may be I need to something > else like something in the example game-of-life where all actions > possible appears somewhere in the environement file ? > 2 - Do steps concern every kind of actions ? Or only actions on the > environnement ? Could internal actions be stepped ? > > Thanks > > Jomi Hubner a écrit : >> Hi Sebastien, >> >> I am not sure, but maybe the 'OverActionsPolicy' could be a solution >> for your problem. The issue is: what to do with the second, third, >> .... actions asked by the agents in the same cycle. The >> TimeSteppedEnvironment has three options: >> >> queue: the second action will be executed in the next cycle >> failSecond: the second action fail >> ignoreSecond: the second action will not fail but is simply >> ignored/discarded >> >> the latter is the default. So, if your agents are doing more than one >> action in the same cycle, some of them are being ignored (without error). >> >> You can change the policy using the setOverActionsPolicy method (see >> API doc for more details). >> >> Note also that this implementation of the environment does not affect >> directly the way the agents behave. 
They can 'run' without taking care >> of any rule. If you want them to respect one action by cycle, you need >> to program them for that (or set queue as policy). >> >> HTH, >> >> Jomi >> >> On Jul 15, 2009, at 5:40 PM, Sebastien Mordelet wrote: >> >>> Thanks again ! >>> I can't figure out how you manage to stay on the cutting edge the way >>> you do ! >>> >>> I did exactly what you did. Steps started but it seems to me that >>> agents are living their life without observing any rules (1 step = 1 >>> action). >>> I ran my system several times, and I obtained different reponses : some >>> times agents did almost all their actions without observing the kind of >>> pause I suppose they should, >>> some times they didn't. >>> >>> So, when I look at the game-of-life example, it seems to me that all >>> the actions are defined in the environnement. Is there a kind of link >>> between that and my problem ? Do the steps only concern the >>> action defined in the environnement ? >>> >>> Thanks >>> >>> >>> Jomi Hubner a écrit : >>>> Hi Sebastien, >>>> >>>> you are right, the timeout for cycles starts counting only after the >>>> first action performed by an agent. >>>> >>>> this problem will be solved in the next release of Jason, by now you >>>> may include a 'fake' action in some agent: >>>> >>>> !startCycle. >>>> +!startCycle <- donothing. >>>> >>>> HTH, >>>> >>>> Jomi >>>> >>>> On Jul 15, 2009, at 10:03 AM, Sebastien Mordelet wrote: >>>> >>>> >>>>> Thanks a lot. You saved me a lot of hours... I was searching my way >>>>> since yesterday afternoon.... >>>>> >>>>> May be I can push an other point : >>>>> I am using a timestepped environnement. And the fact is that when I >>>>> start the simulation, after agents are initialized nothing >>>>> happens...no >>>>> first step. The only way I found to make it starts is to put in my >>>>> agents an internal method unknown by the system. shortly after saying >>>>> the method is unknow I can steps starting.... 
>>>>> Is there a safer boostrap ? >>>>> >>>>> Thanks >>>>> >>>>> Rafael H Bordini a écrit : >>>>> >>>>>> Hi Sebastien, >>>>>> >>>>>> There are various ways of doing this. The environment is a >>>>>> possibility >>>>>> but then you'd make sure you never remove that percept (if the >>>>>> percept >>>>>> is no longer in the environment, Jason will automatically remove the >>>>>> belief when the agent does belief update). >>>>>> >>>>>> There is an agent initialisation method that you could use if you >>>>>> customise the agent architecture but there's no need to go into that >>>>>> level of programming, you can keep it all in the AgentSpeak side. Try >>>>>> something like (not tested): >>>>>> >>>>>> // initial goal to initialise itself >>>>>> !init. >>>>>> >>>>>> // when having this goal, add the skill belief >>>>>> +!init <- .random(N); +skill(N). >>>>>> >>>>>> If you have more common parts to all agents, you can also put that >>>>>> bit >>>>>> of code in a file and use the Include directive in all agents, to >>>>>> avoid copying the same code and having to change each agent if there >>>>>> are changes in that code. >>>>>> >>>>>> HTH, >>>>>> >>>>>> Rafael >>>>>> >>>>>> >>>>>> Sebastien Mordelet wrote: >>>>>> >>>>>>> Hello >>>>>>> >>>>>>> I am currently working on a project where I need BDI agents.So I >>>>>>> came >>>>>>> to JASON which is totally new to me. >>>>>>> >>>>>>> I am trying to have all my agents initialized with a belief : >>>>>>> skill(n), where n is a random. >>>>>>> >>>>>>> Where I can do that? In the environeement ? >>>>>>> And if yes, is there any method used to initialize agents ?I didn't >>>>>>> find things like that in the doc.. >>>>>>> >>>>>>> thanks a lot >>>>>>> >>>>>>> ------------------------------------------------------------------------------ >>>>>>> >>>>>>> >>>>>>> Enter the BlackBerry Developer Challenge This is your chance to win >>>>>>> up to $100,000 in prizes! 
>>>>>>> For a limited time, vendors submitting new
>>>>>>> applications to BlackBerry App World(TM) will have
>>>>>>> the opportunity to enter the BlackBerry Developer Challenge. See full
>>>>>>> prize details at: http://p.sf.net/sfu/Challenge
>>>>>>> _______________________________________________
>>>>>>> Jason-users mailing list
>>>>>>> Jas...@li...
>>>>>>> https://lists.sourceforge.net/lists/listinfo/jason-users
 | 
| 
     
      
      
      From: Sebastien M. <se...@gm...> - 2009-07-16 19:08:27
      
     
   | 
Hi,

thanks for your response. In fact, I already used the queue policy, but whichever policy I tried, the same thing occurred. So maybe I misunderstood something.

1 - My environment only contains the overridden methods that run before and after each step, and that's all. Do I need something else, like in the game-of-life example, where all the possible actions appear somewhere in the environment file?

2 - Do steps concern every kind of action, or only actions on the environment? Could internal actions be stepped?

Thanks

Jomi Hubner wrote:
> Hi Sebastien,
>
> I am not sure, but maybe the 'OverActionsPolicy' could be a solution
> for your problem. The issue is: what to do with the second, third,
> ... actions asked by the agents in the same cycle. The
> TimeSteppedEnvironment has three options:
>
> queue: the second action will be executed in the next cycle
> failSecond: the second action fails
> ignoreSecond: the second action will not fail but is simply
> ignored/discarded
>
> the latter is the default. So, if your agents are doing more than one
> action in the same cycle, some of them are being ignored (without error).
>
> You can change the policy using the setOverActionsPolicy method (see
> the API doc for more details).
>
> Note also that this implementation of the environment does not affect
> directly the way the agents behave. They can 'run' without taking care
> of any rule. If you want them to respect one action per cycle, you need
> to program them for that (or set queue as the policy).
>
> HTH,
>
> Jomi
>
> On Jul 15, 2009, at 5:40 PM, Sebastien Mordelet wrote:
>
>> Thanks again !
>> I can't figure out how you manage to stay on the cutting edge the way
>> you do !
>>
>> I did exactly what you did. Steps started but it seems to me that
>> agents are living their life without observing any rules (1 step = 1
>> action).
>> I ran my system several times, and I obtained different reponses : some >> times agents did almost all their actions without observing the kind of >> pause I suppose they should, >> some times they didn't. >> >> So, when I look at the game-of-life example, it seems to me that all >> the actions are defined in the environnement. Is there a kind of link >> between that and my problem ? Do the steps only concern the >> action defined in the environnement ? >> >> Thanks >> >> >> Jomi Hubner a écrit : >>> Hi Sebastien, >>> >>> you are right, the timeout for cycles starts counting only after the >>> first action performed by an agent. >>> >>> this problem will be solved in the next release of Jason, by now you >>> may include a 'fake' action in some agent: >>> >>> !startCycle. >>> +!startCycle <- donothing. >>> >>> HTH, >>> >>> Jomi >>> >>> On Jul 15, 2009, at 10:03 AM, Sebastien Mordelet wrote: >>> >>> >>>> Thanks a lot. You saved me a lot of hours... I was searching my way >>>> since yesterday afternoon.... >>>> >>>> May be I can push an other point : >>>> I am using a timestepped environnement. And the fact is that when I >>>> start the simulation, after agents are initialized nothing >>>> happens...no >>>> first step. The only way I found to make it starts is to put in my >>>> agents an internal method unknown by the system. shortly after saying >>>> the method is unknow I can steps starting.... >>>> Is there a safer boostrap ? >>>> >>>> Thanks >>>> >>>> Rafael H Bordini a écrit : >>>> >>>>> Hi Sebastien, >>>>> >>>>> There are various ways of doing this. The environment is a >>>>> possibility >>>>> but then you'd make sure you never remove that percept (if the >>>>> percept >>>>> is no longer in the environment, Jason will automatically remove the >>>>> belief when the agent does belief update). 
>>>>> >>>>> There is an agent initialisation method that you could use if you >>>>> customise the agent architecture but there's no need to go into that >>>>> level of programming, you can keep it all in the AgentSpeak side. Try >>>>> something like (not tested): >>>>> >>>>> // initial goal to initialise itself >>>>> !init. >>>>> >>>>> // when having this goal, add the skill belief >>>>> +!init <- .random(N); +skill(N). >>>>> >>>>> If you have more common parts to all agents, you can also put that >>>>> bit >>>>> of code in a file and use the Include directive in all agents, to >>>>> avoid copying the same code and having to change each agent if there >>>>> are changes in that code. >>>>> >>>>> HTH, >>>>> >>>>> Rafael >>>>> >>>>> >>>>> Sebastien Mordelet wrote: >>>>> >>>>>> Hello >>>>>> >>>>>> I am currently working on a project where I need BDI agents.So I >>>>>> came >>>>>> to JASON which is totally new to me. >>>>>> >>>>>> I am trying to have all my agents initialized with a belief : >>>>>> skill(n), where n is a random. >>>>>> >>>>>> Where I can do that? In the environeement ? >>>>>> And if yes, is there any method used to initialize agents ?I didn't >>>>>> find things like that in the doc.. >>>>>> >>>>>> thanks a lot >>>>>> >>>>>> ------------------------------------------------------------------------------ >>>>>> >>>>>> >>>>>> Enter the BlackBerry Developer Challenge This is your chance to win >>>>>> up to $100,000 in prizes! For a limited time, vendors submitting new >>>>>> applications to BlackBerry App World(TM) will have >>>>>> the opportunity to enter the BlackBerry Developer Challenge. See >>>>>> full >>>>>> prize details at: http://p.sf.net/sfu/Challenge >>>>>> _______________________________________________ >>>>>> Jason-users mailing list >>>>>> Jas...@li... 
>>>>>> https://lists.sourceforge.net/lists/listinfo/jason-users >>>>>> >>>> ------------------------------------------------------------------------------ >>>> >>>> Enter the BlackBerry Developer Challenge >>>> This is your chance to win up to $100,000 in prizes! For a limited >>>> time, >>>> vendors submitting new applications to BlackBerry App World(TM) will >>>> have >>>> the opportunity to enter the BlackBerry Developer Challenge. See >>>> full prize >>>> details at: http://p.sf.net/sfu/Challenge >>>> _______________________________________________ >>>> Jason-users mailing list >>>> Jas...@li... >>>> https://lists.sourceforge.net/lists/listinfo/jason-users >>>> >>> >>> >> >> ------------------------------------------------------------------------------ >> >> Enter the BlackBerry Developer Challenge >> This is your chance to win up to $100,000 in prizes! For a limited time, >> vendors submitting new applications to BlackBerry App World(TM) will >> have >> the opportunity to enter the BlackBerry Developer Challenge. See full >> prize >> details at: http://p.sf.net/sfu/Challenge >> _______________________________________________ >> Jason-users mailing list >> Jas...@li... >> https://lists.sourceforge.net/lists/listinfo/jason-users >  | 
| 
     
      
      
      From: Jomi H. <jom...@gm...> - 2009-07-16 19:02:11
      
     
   | 
Hi Felipe,

which method are you using to parse?

BTW, in Jason syntax, since beliefs have to be ground, anything containing variables is considered a rule. If you are using 'belief' from the JavaCC grammar, it is normal that those predicates are parsed as rules. However, if you are using 'literal', they should be parsed as a literal (and not as a rule).

Jomi

On Jul 16, 2009, at 3:48 PM, Felipe Rech Meneguzzi wrote:

> Hi all,
>
> I was doing some experiments with Jason and I noticed that some
> predicates with which I initialized an agent expecting them to be
> beliefs ended up being considered rules; for example, the two
> predicates below are parsed as rules (becoming themselves the head
> of a rule with the body being true).
>
> norm(obligation,
>      evacuate(P,X,Y),
>      "10<=X & X<=40 & 20<=Y & Y<=80",
>      "at_loc(P,X) & unsafe(X) & safe(Y)",
>      emergency_level(X,low),
>      12).
>
> event(a(B),c(DS)).
>
> It seems that whenever a predicate has terms with variables in them,
> they are considered rules for some reason. The problem with this is
> that I expected them to trigger initialization plans, and with them
> being rules, those plans are never invoked. Is that the expected
> behaviour?
>
> Regards,
>
> --
> ___________________________
> Felipe Rech Meneguzzi
> fel...@gm...
> http://fmeneguzzi.blogspot.com
> ___________________________
> _______________________________________________
> Jason-developers mailing list
> Jas...@li...
> https://lists.sourceforge.net/lists/listinfo/jason-developers

--
Jomi Fred Hubner
ENS Mines Saint-Etienne
158 Cours Fauriel
42023 Saint-Etienne Cedex 02
France
http://www.emse.fr/~hubner  | 
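To make the belief/rule distinction concrete, here is a small AgentSpeak sketch (untested, and assuming, as Felipe expected, that initial ground beliefs generate belief-addition events at startup): a ground literal is stored as a belief, while a literal with free variables is read as the head of a rule with body true, so no event is ever produced for it.

```
// ground literal: stored as an initial belief, so the +event(_,_) plan fires
event(a(b), c(d)).

// contains free variables: parsed as the rule  event(a(B),c(DS)) :- true.
// no belief-addition event is generated for it
event(a(B), c(DS)).

// initialization plan triggered only by the ground belief above
+event(X, Y) <- .print("initial belief perceived: ", X, " ", Y).
```

If the intention is an initialization plan per fact, each fact has to be fully ground, or added explicitly from a plan body with `+event(...)`.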
| 
     
      
      
      From: Jomi H. <jom...@gm...> - 2009-07-16 13:38:01
      
     
   | 
Hi Sebastien,

I am not sure, but maybe the 'OverActionsPolicy' could be a solution for your problem. The issue is: what to do with the second, third, ... actions asked by the agents in the same cycle. The TimeSteppedEnvironment has three options:

queue: the second action will be executed in the next cycle
failSecond: the second action fails
ignoreSecond: the second action will not fail but is simply ignored/discarded

the latter is the default. So, if your agents are doing more than one action in the same cycle, some of them are being ignored (without error).

You can change the policy using the setOverActionsPolicy method (see the API doc for more details).

Note also that this implementation of the environment does not directly affect the way the agents behave. They can 'run' without taking care of any rule. If you want them to respect one action per cycle, you need to program them for that (or set queue as the policy).

HTH,

Jomi

On Jul 15, 2009, at 5:40 PM, Sebastien Mordelet wrote:

> Thanks again !
> I can't figure out how you manage to stay on the cutting edge the way
> you do !
>
> I did exactly what you did. Steps started but it seems to me that
> agents are living their life without observing any rules (1 step = 1
> action).
> I ran my system several times, and I obtained different responses:
> sometimes agents did almost all their actions without observing the
> kind of pause I suppose they should,
> sometimes they didn't.
>
> So, when I look at the game-of-life example, it seems to me that all
> the actions are defined in the environment. Is there a kind of link
> between that and my problem? Do the steps only concern the
> actions defined in the environment?
>
> Thanks
>
>
> Jomi Hubner wrote:
>> Hi Sebastien,
>>
>> you are right, the timeout for cycles starts counting only after the
>> first action performed by an agent.
>> >> this problem will be solved in the next release of Jason, by now you >> may include a 'fake' action in some agent: >> >> !startCycle. >> +!startCycle <- donothing. >> >> HTH, >> >> Jomi >> >> On Jul 15, 2009, at 10:03 AM, Sebastien Mordelet wrote: >> >> >>> Thanks a lot. You saved me a lot of hours... I was searching my way >>> since yesterday afternoon.... >>> >>> May be I can push an other point : >>> I am using a timestepped environnement. And the fact is that when I >>> start the simulation, after agents are initialized nothing >>> happens...no >>> first step. The only way I found to make it starts is to put in my >>> agents an internal method unknown by the system. shortly after >>> saying >>> the method is unknow I can steps starting.... >>> Is there a safer boostrap ? >>> >>> Thanks >>> >>> Rafael H Bordini a écrit : >>> >>>> Hi Sebastien, >>>> >>>> There are various ways of doing this. The environment is a >>>> possibility >>>> but then you'd make sure you never remove that percept (if the >>>> percept >>>> is no longer in the environment, Jason will automatically remove >>>> the >>>> belief when the agent does belief update). >>>> >>>> There is an agent initialisation method that you could use if you >>>> customise the agent architecture but there's no need to go into >>>> that >>>> level of programming, you can keep it all in the AgentSpeak side. >>>> Try >>>> something like (not tested): >>>> >>>> // initial goal to initialise itself >>>> !init. >>>> >>>> // when having this goal, add the skill belief >>>> +!init <- .random(N); +skill(N). >>>> >>>> If you have more common parts to all agents, you can also put that >>>> bit >>>> of code in a file and use the Include directive in all agents, to >>>> avoid copying the same code and having to change each agent if >>>> there >>>> are changes in that code. 
>>>> >>>> HTH, >>>> >>>> Rafael >>>> >>>> >>>> Sebastien Mordelet wrote: >>>> >>>>> Hello >>>>> >>>>> I am currently working on a project where I need BDI agents.So I >>>>> came >>>>> to JASON which is totally new to me. >>>>> >>>>> I am trying to have all my agents initialized with a belief : >>>>> skill(n), where n is a random. >>>>> >>>>> Where I can do that? In the environeement ? >>>>> And if yes, is there any method used to initialize agents ?I >>>>> didn't >>>>> find things like that in the doc.. >>>>> >>>>> thanks a lot >>>>> >>>>> ------------------------------------------------------------------------------ >>>>> >>>>> Enter the BlackBerry Developer Challenge This is your chance to >>>>> win >>>>> up to $100,000 in prizes! For a limited time, vendors submitting >>>>> new >>>>> applications to BlackBerry App World(TM) will have >>>>> the opportunity to enter the BlackBerry Developer Challenge. See >>>>> full >>>>> prize details at: http://p.sf.net/sfu/Challenge >>>>> _______________________________________________ >>>>> Jason-users mailing list >>>>> Jas...@li... >>>>> https://lists.sourceforge.net/lists/listinfo/jason-users >>>>> >>> ------------------------------------------------------------------------------ >>> Enter the BlackBerry Developer Challenge >>> This is your chance to win up to $100,000 in prizes! For a limited >>> time, >>> vendors submitting new applications to BlackBerry App World(TM) will >>> have >>> the opportunity to enter the BlackBerry Developer Challenge. See >>> full prize >>> details at: http://p.sf.net/sfu/Challenge >>> _______________________________________________ >>> Jason-users mailing list >>> Jas...@li... >>> https://lists.sourceforge.net/lists/listinfo/jason-users >>> >> >> > > ------------------------------------------------------------------------------ > Enter the BlackBerry Developer Challenge > This is your chance to win up to $100,000 in prizes! 
For a limited > time, > vendors submitting new applications to BlackBerry App World(TM) will > have > the opportunity to enter the BlackBerry Developer Challenge. See > full prize > details at: http://p.sf.net/sfu/Challenge > _______________________________________________ > Jason-users mailing list > Jas...@li... > https://lists.sourceforge.net/lists/listinfo/jason-users -- Jomi Fred Hubner ENS Mines Saint-Etienne 158 Cours Fauriel 42023 Saint-Etienne Cedex 02 France http://www.emse.fr/~hubner  | 
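Jomi's point that the agents themselves must be programmed to respect one action per cycle could look something like the sketch below (untested). It assumes the time-stepped environment publishes a step counter percept named `step(N)` each cycle — check the TimeSteppedEnvironment API doc for the actual percept name — and `do_one_action` stands in for whatever single environment action the agent should perform.

```
// react to each new simulation step with exactly one environment action
// (step(N) percept name and do_one_action are assumptions, not the real API)
+step(N) : true
   <- .print("starting step ", N);
      do_one_action.
```

With this discipline, even under the default ignoreSecond policy no action is silently discarded, because the agent never asks for a second action in the same cycle.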
| 
     
      
      
      From: Sebastien M. <se...@gm...> - 2009-07-15 20:41:08
      
     
   | 
Thanks again !
I can't figure out how you manage to stay on the cutting edge the way you do !

I did exactly what you did. Steps started, but it seems to me that agents are living their life without observing any rules (1 step = 1 action).
I ran my system several times, and I obtained different responses: sometimes agents did almost all their actions without observing the kind of pause I suppose they should, sometimes they didn't.

So, when I look at the game-of-life example, it seems to me that all the actions are defined in the environment. Is there a kind of link between that and my problem? Do the steps only concern the actions defined in the environment?

Thanks


Jomi Hubner wrote:
> Hi Sebastien,
>
> you are right, the timeout for cycles starts counting only after the
> first action performed by an agent.
>
> this problem will be solved in the next release of Jason; for now you
> may include a 'fake' action in some agent:
>
> !startCycle.
> +!startCycle <- donothing.
>
> HTH,
>
> Jomi
>
> On Jul 15, 2009, at 10:03 AM, Sebastien Mordelet wrote:
>
>
>> Thanks a lot. You saved me a lot of hours... I had been searching
>> since yesterday afternoon....
>>
>> Maybe I can push another point:
>> I am using a timestepped environment. The fact is that when I
>> start the simulation, after the agents are initialized nothing
>> happens... no first step. The only way I found to make it start is
>> to put in my agents an internal method unknown by the system.
>> Shortly after it reports the method is unknown, I can see steps
>> starting....
>> Is there a safer bootstrap?
>>
>> Thanks
>>
>> Rafael H Bordini wrote:
>>
>>> Hi Sebastien,
>>>
>>> There are various ways of doing this. The environment is a
>>> possibility,
>>> but then you'd have to make sure you never remove that percept (if
>>> the percept
>>> is no longer in the environment, Jason will automatically remove the
>>> belief when the agent does belief update).
>>> >>> There is an agent initialisation method that you could use if you >>> customise the agent architecture but there's no need to go into that >>> level of programming, you can keep it all in the AgentSpeak side. Try >>> something like (not tested): >>> >>> // initial goal to initialise itself >>> !init. >>> >>> // when having this goal, add the skill belief >>> +!init <- .random(N); +skill(N). >>> >>> If you have more common parts to all agents, you can also put that >>> bit >>> of code in a file and use the Include directive in all agents, to >>> avoid copying the same code and having to change each agent if there >>> are changes in that code. >>> >>> HTH, >>> >>> Rafael >>> >>> >>> Sebastien Mordelet wrote: >>> >>>> Hello >>>> >>>> I am currently working on a project where I need BDI agents.So I >>>> came >>>> to JASON which is totally new to me. >>>> >>>> I am trying to have all my agents initialized with a belief : >>>> skill(n), where n is a random. >>>> >>>> Where I can do that? In the environeement ? >>>> And if yes, is there any method used to initialize agents ?I didn't >>>> find things like that in the doc.. >>>> >>>> thanks a lot >>>> >>>> ------------------------------------------------------------------------------ >>>> >>>> Enter the BlackBerry Developer Challenge This is your chance to win >>>> up to $100,000 in prizes! For a limited time, vendors submitting new >>>> applications to BlackBerry App World(TM) will have >>>> the opportunity to enter the BlackBerry Developer Challenge. See >>>> full >>>> prize details at: http://p.sf.net/sfu/Challenge >>>> _______________________________________________ >>>> Jason-users mailing list >>>> Jas...@li... >>>> https://lists.sourceforge.net/lists/listinfo/jason-users >>>> >> ------------------------------------------------------------------------------ >> Enter the BlackBerry Developer Challenge >> This is your chance to win up to $100,000 in prizes! 
For a limited >> time, >> vendors submitting new applications to BlackBerry App World(TM) will >> have >> the opportunity to enter the BlackBerry Developer Challenge. See >> full prize >> details at: http://p.sf.net/sfu/Challenge >> _______________________________________________ >> Jason-users mailing list >> Jas...@li... >> https://lists.sourceforge.net/lists/listinfo/jason-users >> > >  | 
| 
     
      
      
      From: Jomi H. <jom...@gm...> - 2009-07-15 17:02:24
      
     
   | 
Hi Sebastien,

you are right, the timeout for cycles starts counting only after the first action performed by an agent.

this problem will be solved in the next release of Jason; for now, you may include a 'fake' action in some agent:

!startCycle.
+!startCycle <- donothing.

HTH,

Jomi

On Jul 15, 2009, at 10:03 AM, Sebastien Mordelet wrote:

> Thanks a lot. You saved me a lot of hours... I had been searching
> since yesterday afternoon....
>
> Maybe I can push another point:
> I am using a timestepped environment. The fact is that when I
> start the simulation, after the agents are initialized nothing
> happens... no first step. The only way I found to make it start is
> to put in my agents an internal method unknown by the system.
> Shortly after it reports the method is unknown, I can see steps
> starting....
> Is there a safer bootstrap?
>
> Thanks
>
> Rafael H Bordini wrote:
>> Hi Sebastien,
>>
>> There are various ways of doing this. The environment is a
>> possibility,
>> but then you'd have to make sure you never remove that percept (if
>> the percept
>> is no longer in the environment, Jason will automatically remove the
>> belief when the agent does belief update).
>>
>> There is an agent initialisation method that you could use if you
>> customise the agent architecture, but there's no need to go into that
>> level of programming; you can keep it all in the AgentSpeak side. Try
>> something like (not tested):
>>
>> // initial goal to initialise itself
>> !init.
>>
>> // when having this goal, add the skill belief
>> +!init <- .random(N); +skill(N).
>>
>> If you have more common parts to all agents, you can also put that
>> bit
>> of code in a file and use the Include directive in all agents, to
>> avoid copying the same code and having to change each agent if there
>> are changes in that code.
>> >> HTH, >> >> Rafael >> >> >> Sebastien Mordelet wrote: >>> Hello >>> >>> I am currently working on a project where I need BDI agents.So I >>> came >>> to JASON which is totally new to me. >>> >>> I am trying to have all my agents initialized with a belief : >>> skill(n), where n is a random. >>> >>> Where I can do that? In the environeement ? >>> And if yes, is there any method used to initialize agents ?I didn't >>> find things like that in the doc.. >>> >>> thanks a lot >>> >>> ------------------------------------------------------------------------------ >>> >>> Enter the BlackBerry Developer Challenge This is your chance to win >>> up to $100,000 in prizes! For a limited time, vendors submitting new >>> applications to BlackBerry App World(TM) will have >>> the opportunity to enter the BlackBerry Developer Challenge. See >>> full >>> prize details at: http://p.sf.net/sfu/Challenge >>> _______________________________________________ >>> Jason-users mailing list >>> Jas...@li... >>> https://lists.sourceforge.net/lists/listinfo/jason-users > > ------------------------------------------------------------------------------ > Enter the BlackBerry Developer Challenge > This is your chance to win up to $100,000 in prizes! For a limited > time, > vendors submitting new applications to BlackBerry App World(TM) will > have > the opportunity to enter the BlackBerry Developer Challenge. See > full prize > details at: http://p.sf.net/sfu/Challenge > _______________________________________________ > Jason-users mailing list > Jas...@li... > https://lists.sourceforge.net/lists/listinfo/jason-users -- Jomi Fred Hubner ENS Mines Saint-Etienne 158 Cours Fauriel 42023 Saint-Etienne Cedex 02 France http://www.emse.fr/~hubner  | 
| 
     
      
      
      From: Sebastien M. <se...@gm...> - 2009-07-15 13:03:58
      
     
   | 
Thanks a lot. You saved me a lot of hours... I had been searching since yesterday afternoon....

Maybe I can push another point:
I am using a timestepped environment. The fact is that when I start the simulation, after the agents are initialized nothing happens... no first step. The only way I found to make it start is to put in my agents an internal method unknown by the system. Shortly after it reports the method is unknown, I can see steps starting....
Is there a safer bootstrap?

Thanks

Rafael H Bordini wrote:
> Hi Sebastien,
>
> There are various ways of doing this. The environment is a possibility
> but then you'd have to make sure you never remove that percept (if the
> percept is no longer in the environment, Jason will automatically
> remove the belief when the agent does belief update).
>
> There is an agent initialisation method that you could use if you
> customise the agent architecture but there's no need to go into that
> level of programming, you can keep it all in the AgentSpeak side. Try
> something like (not tested):
>
> // initial goal to initialise itself
> !init.
>
> // when having this goal, add the skill belief
> +!init <- .random(N); +skill(N).
>
> If you have more common parts to all agents, you can also put that bit
> of code in a file and use the Include directive in all agents, to
> avoid copying the same code and having to change each agent if there
> are changes in that code.
>
> HTH,
>
> Rafael
>
>
> Sebastien Mordelet wrote:
>> Hello
>>
>> I am currently working on a project where I need BDI agents. So I
>> came to Jason, which is totally new to me.
>>
>> I am trying to have all my agents initialized with a belief
>> skill(n), where n is a random number.
>>
>> Where can I do that? In the environment?
>> And if so, is there any method used to initialize agents? I didn't
>> find anything like that in the docs.
>>
>> thanks a lot  | 
| 
     
      
      
      From: Rafael H B. <r.b...@ac...> - 2009-07-15 12:46:16
      
     
   | 
Hi Sebastien,

There are various ways of doing this. The environment is a possibility, but then you'd have to make sure you never remove that percept (if the percept is no longer in the environment, Jason will automatically remove the belief when the agent does belief update).

There is an agent initialisation method that you could use if you customise the agent architecture, but there's no need to go into that level of programming; you can keep it all in the AgentSpeak side. Try something like (not tested):

// initial goal to initialise itself
!init.

// when having this goal, add the skill belief
+!init <- .random(N); +skill(N).

If you have more common parts to all agents, you can also put that bit of code in a file and use the Include directive in all agents, to avoid copying the same code and having to change each agent if there are changes in that code.

HTH,

Rafael


Sebastien Mordelet wrote:
> Hello
>
> I am currently working on a project where I need BDI agents. So I came to
> Jason, which is totally new to me.
>
> I am trying to have all my agents initialized with a belief skill(n),
> where n is a random number.
>
> Where can I do that? In the environment?
> And if so, is there any method used to initialize agents? I didn't find
> anything like that in the docs.
>
> thanks a lot  | 
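The Include directive Rafael mentions could be used like this (a sketch; the file name common.asl is hypothetical):

```
// common.asl -- shared initialisation code, written once
// initial goal to initialise itself
!init.

// when having this goal, add the skill belief with a random number
+!init <- .random(N); +skill(N).
```

Each agent's own .asl file then pulls it in with the directive `{ include("common.asl") }`, so a change to the shared initialisation only has to be made in one place.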
| 
     
      
      
      From: Sebastien M. <se...@gm...> - 2009-07-15 12:28:28
      
     
   | 
Hello

I am currently working on a project where I need BDI agents. So I came to Jason, which is totally new to me.

I am trying to have all my agents initialized with a belief skill(n), where n is a random number.

Where can I do that? In the environment?
And if so, is there any method used to initialize agents? I didn't find anything like that in the docs.

thanks a lot  |