[ml] [Noisebridge-discuss] I am interested in starting an optimization group at noisebridge (BetterBridge? TrollSearch?)

Greg Hornby ghornby at gmail.com
Mon May 30 15:32:36 UTC 2011


Some of the better Black Box Optimizers use some form of Statistical Machine
Learning. For example, look at last year's winners of GECCO's Black Box
Optimization workshop (GECCO being one of the big Evolutionary Algorithm
conferences):
http://coco.gforge.inria.fr/doku.php?id=bbob-2010
http://coco.gforge.inria.fr/doku.php?id=bbob-2009-results

One of the best algorithms is the Covariance Matrix Adaptation Evolution
Strategy (CMA-ES):
  http://www.lri.fr/~hansen/cmaesintro.html
This approach has been in development for a while, and because of its
success it is inspiring new EA/SML hybrid systems.
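
For anyone who wants a feel for the idea before reading the full CMA-ES
code, here is a minimal sketch of a plain (mu, lambda) evolution strategy in
Python. It is not CMA-ES itself (no covariance adaptation, just isotropic
Gaussian mutation and truncation selection), and the sphere function is only
a toy stand-in for a real black-box objective.

# A minimal (mu, lambda) evolution strategy: a much simpler relative of
# CMA-ES, with a fixed isotropic step size instead of covariance adaptation.
import random

def sphere(x):
    # toy black-box objective: minimize the sum of squares
    return sum(xi * xi for xi in x)

def simple_es(f, dim=5, mu=5, lam=20, sigma=0.3, generations=200):
    # start from mu random parents
    parents = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        # each offspring is a Gaussian perturbation of a random parent
        offspring = []
        for _ in range(lam):
            p = random.choice(parents)
            offspring.append([xi + random.gauss(0, sigma) for xi in p])
        # keep the mu best offspring as the next parents (comma selection)
        offspring.sort(key=f)
        parents = offspring[:mu]
    best = min(parents, key=f)
    return best, f(best)

x, fx = simple_es(sphere)
print("best x:", x, "f(x):", fx)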

Greg



On Sun, May 29, 2011 at 10:05 AM, Crutcher Dunnavant <crutcher at gmail.com> wrote:

>    Here's a class of problems I'm interested in.
>
> Given a function f:a->b, and a known y in b, find an x in a such that
> f(x)~=y.
>
> This problem is only solvable if f has certain characteristics, but it has
> many very exciting applications. We can search for the muscle movements
> underlying a speech act, the fingerings behind a piece of music, and the
> texture on the set of an old tv show.
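>
> A throwaway sketch of the setup in Python, with an assumed toy f (a shifted
> quadratic) standing in for a real forward model, and plain random-restart
> hill climbing standing in for a serious search algorithm: minimize the
> distance |f(x) - y| and keep the best x found.
>
> import random
>
> def f(x):
>     # hypothetical forward model: we pretend we cannot look inside it
>     return (x - 3.0) ** 2 + 1.0
>
> def invert(f, y, tries=50, steps=2000, step_size=0.1):
>     # search for x with f(x) ~= y via random restarts plus hill climbing
>     best_x, best_err = None, float("inf")
>     for _ in range(tries):
>         x = random.uniform(-10, 10)
>         for _ in range(steps):
>             candidate = x + random.gauss(0, step_size)
>             # accept the move only if it brings f(x) closer to y
>             if abs(f(candidate) - y) < abs(f(x) - y):
>                 x = candidate
>         err = abs(f(x) - y)
>         if err < best_err:
>             best_x, best_err = x, err
>     return best_x, best_err
>
> x, err = invert(f, y=5.0)
> print(x, err)   # x near 1.0 or 5.0, since f(1) == f(5) == 5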
>
> Calling this machine learning is a bit of a stretch, but sure, we can talk
> about it. Just don't expect too much rigor from me yet.
>
> On Sat, May 28, 2011 at 7:32 PM, Brian Morris <cymraegish at gmail.com> wrote:
>
>> I apologize that perhaps I was all over the place before in this thread.
>> As far as the algorithms and projects go, I am very interested in anything
>> longer-term and research-oriented, especially if it might lead to
>> publication. I'll try my best to get to the ML meeting this week if I can.
>>
>> My interest in ML did indeed evolve from Linguistics/CogSci/Philosophy;
>> however, my formal training was in Math/Numerical Scientific Computing. I
>> got away from the latter for some years, and on coming back and reading a
>> lot I got into the former through self-study. I gradually moved into
>> computational linguistics, then out again, and my interest in the ML group
>> comes from exploring some real-world problems. My more recent math
>> interests are mostly in logic and reasoning models.
>>
>> The wiki page for ML at Noisebridge lists an ML video course from Stanford
>> (see https://www.noisebridge.net/wiki/NBML_Course), but last week I was
>> indeed looking at the ESL text, which Mike included in his references for
>> random forests (the first presentation he gave, before last week's).
>>
>> My web scraping by hand led me to two or three groups in the East and
>> Midwest, with some connections going way back to MIT and Yale but more
>> recently to Michigan, Indiana, and CMU (in Europe: a) maybe the Edinburgh
>> school as well? They used the term ML back in the late 60s to distinguish
>> themselves from AI workers; b) Luc Steels' work on the evolution of
>> language with ALife and robotics). Most striking to me is this particular
>> HRI group:
>>
>> http://hrilab.cs.tufts.edu/research/
>>
>>
>> and I am especially impressed by the perspective of the CMU Machine
>> Learning department head, Tom Mitchell:
>>
>> Book:
>> www.cs.cmu.edu/~tom/mlbook.html
>> Interview on his perspective:
>> videolectures.net/mlas06_mitchell_itm/
>>
>> On Sat, May 28, 2011 at 4:56 PM, Kai Chang <kai.salmon.chang at gmail.com> wrote:
>>
>>> I did cognitive science at UVA, and there the approach was quite
>>> different. The curriculum seemed heavily influenced by MIT, with more focus
>>> on linguistics, psychology, and philosophy. Not sure what the approach at CMU
>>> is like.
>>>
>>> At Stanford, the emphasis seems to be on optimization, classification,
>>> etc. The problem requires a fitness function or other well-defined metrics
>>> to test against. More related to math and statistics than cognition in
>>> general.
>>>
>>> For broader questions of cognition and learning, there are several other
>>> traditions. Logicians and linguists: Wittgenstein, Gödel and Saussure.
>>> Phenomenologists: Heidegger, Merleau-Ponty, Marx. Philosophers: Hume,
>>> Nietzsche, Deleuze. Buddhism also has very nuanced views on the subject.
>>>
>>> These traditions share doubts about symbolic formalism and rationality as
>>> sufficient means of describing human intelligence and experience.
>>>
>>> For example, we have overwhelming evidence that neurons serve as building
>>> blocks in networks with the capability to learn broad classes of problems.
>>> But we still have no foundation to demonstrate that neurons produce our
>>> phenomenal experience (the rich, inner, subjective world).
>>>
>>> Anyways. Crutcher, I'll drop by next week. I got really busy last time ML
>>> was doing a group project. I'd be interested in hearing about these other
>>> applications of metaheuristic search too!
>>>
>>> On Fri, May 27, 2011 at 8:14 PM, David Faden <dfaden at gmail.com> wrote:
>>>
>>>> West = http://www-stat.stanford.edu/~tibs/ElemStatLearn/ ?
>>>>
>>>> What's the East Coast book?
>>>>
>>>> How would you classify http://aima.cs.berkeley.edu/ ?
>>>>
>>>> Interesting to hear about this division. Thanks.
>>>>
>>>> On Fri, May 27, 2011 at 7:03 PM, Brian Morris <cymraegish at gmail.com> wrote:
>>>>
>>>>> I am very much interested in the more hands-on, project-as-a-group approach.
>>>>>
>>>>> I probably won't make it to the ML meeting this coming week, though
>>>>> (most weeks, 3 out of 4, I am there).
>>>>>
>>>>> There are two perspectives on what machine learning is, which we might
>>>>> call, for lack of better terms, the Stanford perspective and the
>>>>> Carnegie Mellon perspective (from the locations of the authors of two
>>>>> popular texts), or East and West if you like.
>>>>>
>>>>> I generally take the East side and Mike the West, but I would also
>>>>> like to work on more general problems with a group, maybe less
>>>>> statistical learning... and/or problems which have orderings rather
>>>>> than actual or precise numerical values (ordinal variables), so that
>>>>> optimization is possible even if formulas cannot be given or numerical
>>>>> data is either too imprecise or simply unavailable. [For example, if
>>>>> you have labels like letter grades with no numbers behind them, how do
>>>>> you best assign/design content? See the sketch below.]
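>>>>>
>>>>> A rough sketch of what I mean, in Python, with a made-up grading
>>>>> oracle: a hill climber that never sees a numeric fitness, only the
>>>>> ordinal question "is candidate A graded at least as well as B?", which
>>>>> is all a letter-grade scale really gives you.
>>>>>
>>>>> import random
>>>>>
>>>>> GRADES = ["F", "D", "C", "B", "A"]   # worst to best, no arithmetic
>>>>>
>>>>> def grade(x):
>>>>>     # hypothetical grader: candidates near 7.0 earn better grades
>>>>>     distance = abs(x - 7.0)
>>>>>     if distance < 0.5:
>>>>>         return "A"
>>>>>     if distance < 1.5:
>>>>>         return "B"
>>>>>     if distance < 3.0:
>>>>>         return "C"
>>>>>     if distance < 5.0:
>>>>>         return "D"
>>>>>     return "F"
>>>>>
>>>>> def at_least_as_good(a, b):
>>>>>     # the only feedback the optimizer is allowed to use
>>>>>     return GRADES.index(grade(a)) >= GRADES.index(grade(b))
>>>>>
>>>>> def ordinal_hill_climb(steps=1000, step_size=0.5):
>>>>>     x = random.uniform(0, 20)
>>>>>     for _ in range(steps):
>>>>>         candidate = x + random.gauss(0, step_size)
>>>>>         if at_least_as_good(candidate, x):
>>>>>             x = candidate
>>>>>     return x, grade(x)
>>>>>
>>>>> print(ordinal_hill_climb())   # typically lands near 7.0, grade "A"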
>>>>>
>>>>> This maybe applies to some problems in knowledge management, natural
>>>>> language/linguistics problems, policy decision making (how to maximize
>>>>> job satisfaction, for instance), and intelligence analysis.
>>>>>
>>>>> Kinda short on specific problems, might have to Troll Search them.
>>>>>
>>>>> By the way, what's the best way to produce a table of contents rather
>>>>> than an index (given, say, a bunch of text scraped off the Web)?
>>>>>
>>>>> On Fri, May 27, 2011 at 1:27 PM, Crutcher Dunnavant <
>>>>> crutcher at gmail.com> wrote:
>>>>>
>>>>>> The ML group seems to have grown up quite a bit since the last time I
>>>>>> paid attention, and I think I should start participating, as the page
>>>>>> lists many things I'd like to learn and play with.
>>>>>>
>>>>>> I am specifically suggesting a group-project-oriented group, rather
>>>>>> than a research group or class: something that would yield finished
>>>>>> projects, something where we collaborate on a common code base and
>>>>>> problem set, beat it to death, publish it (5mof?), and move on to the
>>>>>> next one.
>>>>>>
>>>>>> There are many applications of metaheuristic search outside machine
>>>>>> learning, and I don't want to hijack a group which looks healthy.
>>>>>>
>>>>>> On Fri, May 27, 2011 at 12:00 PM, Mike Schachter <mike at mindmech.com> wrote:
>>>>>>
>>>>>>> Hi Crutcher,
>>>>>>>
>>>>>>> I'd be interested in black box optimization. The machine learning
>>>>>>> group meets up on Wednesdays at 7:30pm in the Church classroom:
>>>>>>>
>>>>>>> https://www.noisebridge.net/index.php?title=Machine_Learning
>>>>>>>
>>>>>>> Just speaking for myself, I'd be happy to see you share time/space
>>>>>>> with the ML group to talk about optimization, as it's a core part of
>>>>>>> machine learning.
>>>>>>>
>>>>>>> We don't have anything going on next week, and you're welcome to
>>>>>>> come in to talk about stuff; I'd be happy to discuss optimization
>>>>>>> with you!
>>>>>>>
>>>>>>>  mike
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, May 27, 2011 at 10:10 AM, Crutcher Dunnavant <
>>>>>>> crutcher at gmail.com> wrote:
>>>>>>> > I am very interested in starting a black box optimization search
>>>>>>> > group at noisebridge. This field is called "metaheuristics", but
>>>>>>> > the name is a stupid historical artifact (so says everyone in the
>>>>>>> > field).
>>>>>>> > Optimization is, given a function f(x), searching for the x which
>>>>>>> > yields the best f(x). Black box optimization is a sub-field of
>>>>>>> > optimization where you can't analyze the function f to determine
>>>>>>> > what values of x are likely to be good, so you have to search the
>>>>>>> > space for them.
>>>>>>> > The following algorithms are ALL metaheuristic optimization (a
>>>>>>> > short sketch of one of them follows the list):
>>>>>>> > Hill Climbing (a.k.a. Gradient Ascent/Descent)
>>>>>>> > Genetic Search
>>>>>>> > Genetic Programming
>>>>>>> > Ant Colony Systems
>>>>>>> > Particle Swarm Optimization
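>>>>>>> >
>>>>>>> > As a hedged illustration (made-up problem and parameters), here is
>>>>>>> > a minimal genetic search in Python on the toy OneMax problem
>>>>>>> > (maximize the number of 1 bits), just to show the basic shape of a
>>>>>>> > metaheuristic: population, selection, crossover, mutation.
>>>>>>> >
>>>>>>> > import random
>>>>>>> >
>>>>>>> > GENOME_LEN = 40
>>>>>>> >
>>>>>>> > def fitness(bits):
>>>>>>> >     return sum(bits)              # OneMax: count the 1s
>>>>>>> >
>>>>>>> > def tournament(pop, k=3):
>>>>>>> >     # pick the fittest of k random individuals
>>>>>>> >     return max(random.sample(pop, k), key=fitness)
>>>>>>> >
>>>>>>> > def crossover(a, b):
>>>>>>> >     point = random.randrange(1, GENOME_LEN)
>>>>>>> >     return a[:point] + b[point:]
>>>>>>> >
>>>>>>> > def mutate(bits, rate=1.0 / GENOME_LEN):
>>>>>>> >     return [b ^ 1 if random.random() < rate else b for b in bits]
>>>>>>> >
>>>>>>> > def genetic_search(pop_size=50, generations=100):
>>>>>>> >     pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
>>>>>>> >            for _ in range(pop_size)]
>>>>>>> >     for _ in range(generations):
>>>>>>> >         pop = [mutate(crossover(tournament(pop), tournament(pop)))
>>>>>>> >                for _ in range(pop_size)]
>>>>>>> >     return max(pop, key=fitness)
>>>>>>> >
>>>>>>> > best = genetic_search()
>>>>>>> > print(fitness(best), "of", GENOME_LEN)
>>>>>>> >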
>>>>>>> > I've recently read a fabulous undergraduate text on the subject,
>>>>>>> very
>>>>>>> > approachable, called "Essentials of Metaheuristics".
>>>>>>> > The book in question is available from Lulu and Amazon:
>>>>>>> > http://www.cs.gmu.edu/~sean/book/metaheuristics/
>>>>>>> > or you can just download the PDF.
>>>>>>> > http://www.cs.gmu.edu/~sean/book/metaheuristics/Essentials.pdf
>>>>>>> >
>>>>>>> > If you aren't sure what I'm talking about, read the first chapter
>>>>>>> > or two. If you have a background in programming, you should be
>>>>>>> > able to follow it trivially.
>>>>>>> > What I want TrollSearch to do: Build Shit
>>>>>>> > Let's find interesting problems and build search algorithms over
>>>>>>> > them. This can apply to evolving good-fit 3D models for the
>>>>>>> > printer, making techno, or identifying penii.
>>>>>>> > I'd like TrollSearch to look much more like SpaceBridge than like
>>>>>>> > the Python Class.
>>>>>>> > Please comment in-thread if you are interested.
>>>>>>> > --
>>>>>>> > Crutcher Dunnavant <crutcher at gmail.com>
>>>>>>> >
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Crutcher Dunnavant <crutcher at gmail.com>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>
>
> --
> Crutcher Dunnavant <crutcher at gmail.com>
>