I have to solve Kirkman's Schoolgirl problem using simulated annealing in C#.

Can anyone please help or give some guidance on how to solve this?

Thanks.
Comments
Philip Stuyck 11-Aug-12 13:18pm    
Reference from Wikipedia:
Kirkman's schoolgirl problem is a problem in combinatorics proposed by Rev. Thomas Penyngton Kirkman in 1850 as Query VI in The Lady's and Gentleman's Diary (pg.48). The problem states:


Fifteen young ladies in a school walk out three abreast for seven days in succession: it is required to arrange them daily so that no two shall walk twice abreast.[1]
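
To make this concrete: a candidate arrangement can be scored by counting how many pairs of girls walk together more than once, so a perfect schedule scores zero. Below is a minimal C# sketch of one possible representation and cost ("energy") function; the jagged-array layout and names such as CountRepeatedPairs are illustrative assumptions, not code from this thread.

C#:
// One possible representation: schedule[day][group] holds the 3 girl indices (0..14)
// walking together in that group on that day. The "energy" of a schedule is the
// number of extra times any pair walks together; a valid Kirkman schedule scores 0.
using System;
using System.Collections.Generic;

static class KirkmanCost
{
    public const int Days = 7, GroupsPerDay = 5, GroupSize = 3, Girls = 15;

    public static int CountRepeatedPairs(int[][][] schedule)
    {
        var pairCounts = new Dictionary<(int, int), int>();
        foreach (var day in schedule)
            foreach (var group in day)
                for (int i = 0; i < group.Length; i++)
                    for (int j = i + 1; j < group.Length; j++)
                    {
                        // Store each pair in a canonical (smaller, larger) order.
                        var pair = group[i] < group[j]
                            ? (group[i], group[j])
                            : (group[j], group[i]);
                        pairCounts.TryGetValue(pair, out int seen);
                        pairCounts[pair] = seen + 1;
                    }

        int conflicts = 0;
        foreach (int count in pairCounts.Values)
            if (count > 1) conflicts += count - 1;  // each repeat meeting is one conflict
        return conflicts;
    }
}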

Simulated annealing (SA) is a generic probabilistic metaheuristic for the global optimization problem of locating a good approximation to the global optimum of a given function in a large search space. It is often used when the search space is discrete (e.g., all tours that visit a given set of cities). For certain problems, simulated annealing may be more efficient than exhaustive enumeration — provided that the goal is merely to find an acceptably good solution in a fixed amount of time, rather than the best possible solution.

The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one.

By analogy with this physical process, each step of the SA algorithm attempts to replace the current solution by a random solution (chosen according to a candidate distribution, often constructed to sample from solutions near the current solution). The new solution may then be accepted with a probability that depends both on the difference between the corresponding function values and also on a global parameter T (called the temperature), that is gradually decreased during the process. The dependency is such that the choice between the previous and current solution is almost random when T is large, but increasingly selects the better or "downhill" solution (for a minimization problem) as T goes to zero. The allowance for "uphill" moves potentially saves the method from becoming stuck at local optima—which are the bane of greedier methods.
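
By way of illustration, here is a minimal C# sketch of that acceptance rule with a simple geometric cooling schedule. The energy and neighbor delegates stand for a scoring function (such as the pair-counting sketch above) and a small random move (for example, swapping two girls between groups on the same day); all names and parameter values here are assumptions, not part of any standard API.

C#:
// A bare-bones annealing loop implementing the acceptance rule described above:
// downhill moves are always taken, uphill moves are taken with probability
// exp(-delta / T), and T shrinks geometrically each iteration.
using System;

static class Annealer
{
    public static int[][][] Solve(
        int[][][] start,
        Func<int[][][], int> energy,                  // e.g. CountRepeatedPairs
        Func<int[][][], Random, int[][][]> neighbor,  // returns a slightly changed copy
        double startTemp = 10.0,
        double cooling = 0.9995,
        int iterations = 200000,
        int seed = 0)
    {
        var rng = new Random(seed);
        double t = startTemp;
        var current = start;
        int currentEnergy = energy(current);
        var best = current;
        int bestEnergy = currentEnergy;

        for (int i = 0; i < iterations && bestEnergy > 0; i++)
        {
            var candidate = neighbor(current, rng);
            int candidateEnergy = energy(candidate);
            int delta = candidateEnergy - currentEnergy;

            // Accept all improvements; accept worsening moves with probability
            // exp(-delta / T), which shrinks as the temperature cools.
            if (delta <= 0 || rng.NextDouble() < Math.Exp(-delta / t))
            {
                current = candidate;
                currentEnergy = candidateEnergy;
                if (currentEnergy < bestEnergy) { best = current; bestEnergy = currentEnergy; }
            }
            t *= cooling;
        }
        return best;
    }
}

Starting from a random schedule and restarting with different seeds when a run stalls is usually necessary, since simulated annealing gives no guarantee of reaching energy 0 on any single run.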

The method was independently described by Scott Kirkpatrick, C. Daniel Gelatt and Mario P. Vecchi in 1983,[1] and by Vlado Černý in 1985.[2] The method is an adaptation of the Metropolis-Hastings algorithm, a Monte Carlo method to generate sample states of a thermodynamic system, invented by M.N. Rosenbluth in a paper by N. Metropolis et al. in 1953.[3]
Philip Stuyck 11-Aug-12 13:28pm    
This is homework, right?
Volynsky Alex 11-Aug-12 15:15pm    
I hope these links will help you:
http://msdn.microsoft.com/en-us/magazine/hh708758.aspx
http://www.heatonresearch.com/online/introduction-neural-networks-cs-edition-2/chapter-7

1 solution

 
 
