The Murder Gene

Started by antediluvian, February 16, 2013, 09:46:27 PM



Quote from: "drunkenshoe"I find the term 'genius' nonsensical; I refuse to recognise it, especially because of the way it is generally seen today.
I agree.  At the very least, "genius" as an IQ level is a horrible definition.  The word is rooted in the Latin for "to bring into being, to produce."  A very creative thinker, you could argue, introduces new CONNECTIONS between ideas into the world, or interesting new angles on the interpretation of a musical piece, and a physical scientist may actually create something the world has never seen.  Those are actual geniuses.

But someone who can rotate blocks mentally, or remember long strings of numbers, or track nine items rather than six, shouldn't really be called a genius.  Many very clever people produce very little.
Insanity is the only sensible response to the universe.  The sane are just making stuff up.


Quote from: "drunkenshoe"You mentioned somewhere that you have seen research relating IQ to success, despite some people making statements against it. I made that statement, because reading Malcolm Gladwell's 'Outliers' made me see what we call 'success' very differently.

The correlation depends on the variable of interest (VOI), and on how we operationally define success. I'm not as up on the correlational results, because I honestly prefer experimental psychology.

QuoteI can also tell you that, from my field, all those historical characters people are programmed to see as 'Great Geniuses of All Time' are just normal, curious, somewhat talented people born at the 'right' time/place who had the right relations. Most are made into cults much later, some are promoted as tokens of national identity in an unconscious process of culture-making, and a few are self-promotions. Most of them were unbelievably religious and dumb.

I find the term 'genius' nonsensical; I refuse to recognise it, especially because of the way it is generally seen today. I can give you a lot of examples and explain why the term exists in terms of my field, but I don't have the time now and I don't even know if you'd like to talk about this. Hope you do.

I don't have strong opinions about "genius" either way, if only because it's a word. You could reserve 'genius' for an IQ result and 'savant' for the colloquial sense, and it wouldn't change much, at least to me.

It just seems to me that people dislike the tests out of offense and personal definitions. Then again, I default to the old video-game journalism idea that, if you like something, it shouldn't matter what other people think. I'm fairly certain I'm of average, if not slightly below average, IQ, and I like to spend my time playing video games and reading about biology/physics. Whether I call someone a genius, or whether or not I am one, does not infringe on my ability to enjoy my morning coffee. As the saying often attributed to Einstein goes, "Everyone is a genius, but if you judge a fish by how well it can climb a tree it will go its whole life believing it's stupid."

IQ testing has its place, and is useful for certain things. Do I think it's an all-encompassing tool? No, but it doesn't claim to be either. It attempts to define a concept, and to quantify it. In that regard, I think it's successful. I just think we grow attached to conceptual definitions.
Quote from: "azmhyr"
Quote from: "quoting"New Testament doesn't justify the banning of gays from anywhere.
Well, the old testament permabans them from life tho.


They are doing a series on PBS called "After Newtown." This segment explores what you are speaking about. ... iller.html


Quote from: "drunkenshoe"Pilgrim,

I understand little of what you are talking about; it's outside my comfort zone. I need to look up words and read about it to understand what exactly you are doing. I need to learn the mechanics of reading a response, and how the consciousness of a brain can be measured, and by doing what.

It's difficult stuff. I can only give you the absolute basics of EEG, sadly. There is, however, a professor I'd really like to work with who could apparently turn me into an expert. Which would be rad.

QuoteWell, I didn't even know about her, but when I used the term 'collective memory' -incorrectly, mostly because of not knowing what to use in English to describe what I meant- in one of my posts, this is what I mean. That's also why I jumped to the 'We can't trust humans' cliché. I never thought of an experiment with any other animal. I was thinking of experiments where we directly gain information from the subject about what she/he feels about what. And I wasn't even thinking about anything at a neurological level.

I don't like to divorce the two processes, but that's because I'm an advocate for cognitive neuroscience. =P

The thing is, we like to think of the brain-behavior link as instantaneous; we understand, from the evidence, that there are separate areas of the brain that link to different things, but we fail to integrate that understanding sufficiently, and we consider the emergent faculties of the human brain as separate from its processes. The fact of the matter is that we have numerous projections, related to what some would call reflexes, that activate before we can consciously inhibit them. It's why you jump at a harmless sound, and only afterwards appraise the situation.

QuoteBear with me. I am talking about how humans start to accumulate information about the world and other humans from the moment they are born. They shape this information according to their own personalities while they are developing, while at the same time being shaped by that information. There are people who are afraid of cats, for example. Some of them remember being scratched or bitten by one when they were little. Some don't remember that, but they find little furry animals dangerous and don't like them at all. This is a simple example, but there are more complex situations where humans develop defence mechanisms THAT in time become personality traits. For example, a parent telling a child she cannot do something, because she can't. It could be for any reason: for temporary control so she wouldn't hurt herself, or because the parent was abused as a minor in the same way...blah blah. It doesn't have to be negative or traumatic either. Humans are walking defence mechanisms. They will always try to modify some 'negative' parts of themselves that they 'learned' are 'unwanted', 'uncool', 'not to be desired', 'out of fashion', and they will make themselves believe that they don't possess those traits. With mice this is out of the question.

Different theorists would take your paragraph different ways. Strict behaviorists would say we are just the result of learning sequential orders of reinforcement and of association between stimuli. It's why some people have an aversion to cats: pair a neutral conditioned stimulus (CS, the cat) with an unconditioned stimulus (UCS, the scratch) that produces an unconditioned response (UCR, pain/fear), and eventually the CS alone comes to produce a conditioned response (CR).
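For what it's worth, that associative-learning idea has a standard quantitative formalization, the Rescorla-Wagner model. Here's a minimal sketch of it; this is my own illustration rather than anything from the thread, and the parameter values are arbitrary:

```python
# Minimal sketch of the Rescorla-Wagner rule: the associative strength V
# between a CS (e.g. a cat) and a UCS (e.g. a scratch) grows on each pairing,
# driven by prediction error, until it approaches the maximum strength lambda.

def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Return associative strength V after each CS-UCS pairing."""
    v = 0.0
    history = []
    for _ in range(trials):
        v += alpha * (lam - v)  # learning step is proportional to (lam - v)
        history.append(v)
    return history

strengths = rescorla_wagner(10)
# Early trials produce big jumps, later trials small ones: the classic
# negatively accelerated learning curve.
```

The prediction-error term `(lam - v)` is what gives the curve its shape: the more the CS already predicts the UCS, the less each new pairing adds.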

The problem with your child example is that children often don't conform to that idea. A parent might say "no," and a child will say "okay"...until the parent leaves =P

I don't think "defense mechanisms" are the right word here.

What do you mean this is out of the question with mice? Mice - and other animals - do the exact same thing. That's the nature of dog-training.

QuoteBut the idea that our 'smarts' might be related to our hands does NOT rely on how many fingers we have. It's about the size of our thumbs relative to our other fingers, compared with other primates' thumbs relative to their fingers. If you look at it from this angle, our thumbs are 'overdeveloped', and that might have helped the evolution of our smarts. I talked about this with anthropologists. One of them said this can be neither refuted nor proved, which I get very well.

I know about Dawkins's anachronism example, and I also get what he is trying to explain there. But developing a technology has so many other dynamics behind it: the evolution of science, and countless other things manipulating the scientific tradition and the accumulation of knowledge before it arrives at a certain outcome. So it is not a good analogy, in my opinion. That's a general problem with anachronistic analogies, and with fantastical analogies too. They don't arrive anywhere, but you can talk about them for hours.

First of all, the analogy dismisses a historical fact -we know for sure- that there are periods in history where knowledge was 'forgotten' under certain circumstances. Religion, wars... The discovery of zero, and that the world is spherical, had to be discovered all over 'again'. So speculating about what would have happened 200 years ago if something were slightly different does not really mean much. The variables are virtually astronomical. There are also accidental discoveries which don't rely on scientific intuition or any method. The discovery of the basic understanding that led to 'taking a photographic image', for example, was purely accidental.

(I could claim that lots of things would be different if the Ancient Greeks hadn't built a colony called Magna Graecia in Italy. I could discuss it by following the whole development of the 'tradition' they created there across 2500 years of Western history, and explain that if that hadn't happened, Western culture wouldn't be anywhere near where it is now. But that doesn't mean anything.)

I'm aware the idea is flawed; it's just an interesting hypothesis =P


Quote from: "drunkenshoe"So in layman's terms, this is happening because the brain is not actually working like a computer at all. Because it is not actually 'wired' in a certain way, though of course there are some common 'points'. I started reading Norman Doidge's 'The Brain that Changes Itself' but couldn't find the time to finish it. He talks about some old conventions about the human brain which turned out to be wrong in recent studies. But this might be another thing completely; probably my mind just summons that little bit of info.

The computer metaphor/analogy is one of the most common ones used in any science, because most people who hear it are familiar with how a computer works at the most basic level. I don't like it, because it oversimplifies. It's great for modelling, but lousy for teaching.

Our neurons make numerous connections when we are first born (more develop past infancy, but bear with me). But we don't end up using all of them, so we prune them back and remove them. Our brains are constantly engaged across multiple systems, and our use of certain information regulates whether or not we keep certain connections. It's not an "input-output" scenario, because there's so much going on. However, we perceive an input-output scenario.
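The use-it-or-lose-it pruning described above can be caricatured in a few lines. This is purely a toy illustration of my own, not a neural model; the synapse names and counts are made up:

```python
# Toy sketch of use-dependent pruning: start with many connections,
# let experience "use" only some of them, then remove the unused ones.

connections = {f"synapse_{i}": 0 for i in range(10)}  # usage counts, all zero

# Experience activates a few connections (repetition strengthens synapse_3).
for used in ["synapse_1", "synapse_3", "synapse_3", "synapse_7"]:
    connections[used] += 1

# Prune: connections that were never used are removed entirely.
pruned = {name: count for name, count in connections.items() if count > 0}
# Only synapse_1, synapse_3, and synapse_7 survive.
```

The point of the caricature is the direction of causality: the surviving wiring is a consequence of use, not a fixed input-output diagram laid down in advance.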

It's the same problem I have with using programming language for DNA, or saying "DNA is the blueprint." It's not. DNA codes for proteins. We like to think of DNA as blueprints for a house, when really all DNA does is create a hammer, a window, some lumber, and a construction worker. The end result, the house, is a byproduct of those things; it's not specifically what's coded for.
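To make the "DNA codes for proteins, not blueprints" point concrete, here is a toy translation sketch. The codon table is deliberately tiny (the real one has 64 entries), and the DNA fragment is invented for illustration:

```python
# Sketch: DNA specifies parts (amino acids chained into proteins),
# not the finished "house". These are a few real codon assignments.

CODON_TABLE = {
    "ATG": "Met",   # methionine, the usual start codon
    "TGG": "Trp",   # tryptophan
    "GGC": "Gly",   # glycine
    "TAA": "STOP",  # stop codon ends translation
}

def translate(dna):
    """Translate a DNA string codon-by-codon until a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE.get(dna[i:i + 3], "?")  # "?" for codons not in our tiny table
        if aa == "STOP":
            break
        protein.append(aa)
    return protein

print(translate("ATGTGGGGCTAA"))  # -> ['Met', 'Trp', 'Gly']
```

Nothing in the table says what the finished organism looks like; it only says which part each three-letter codon produces, which is the poster's point.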

QuoteWhen I said that about mice, I was thinking about the cognitive experiments you people do with humans, versus any other possible experiments done with animals like mice. They cannot be compared by the classical method: the response you get from a mouse versus what you get from a human. I was again thinking of any kind of experiment that would include verbal or written information given by the subject, but I guess this is the mistake I am generally making in thinking about the experiments you need to do. When psychology gets in, I automatically think of some sort of 'conversation', an 'exchange of information'. How much is a mouse affected by being observed, and how much is a human? There must be an unreachable gap there. I need to read something written for laymen about that to be informed. Where your brain has 'highways' about this, mine doesn't even have little paths; information doesn't travel through to the needed stations.  :wink:

I inherently dislike verbal/written (read: subjective) data, if only because it's subject to many more biases than straight up physiological/objective data. But again, I like experiments.

Quote:roll: It's not a hypothesis. It is an anachronistic analogy which took me 20 seconds to refute.

Fair enough.