A natural bias for the basic level

It is well established that people can categorize the same objects at different levels of abstraction (i.e., superordinate, basic, and subordinate). Of these, the basic level is known to have a privileged status that is often attributed to the organization of categories in memory. Here, we argue that the bias could in part arise from the image formation process itself, i.e., the object properties for categorization that arise from the 2D retinal projections of distal 3D objects. In the real world, people do categorize objects from a variety of viewing distances, and these distances modify the availability of object information on the retina. In two experiments, we tested the hypothesis that the information for basic categorizations is more resistant to changes in viewing distance than that for subordinate categorizations.


Bibliographic Details
Main Authors: Annie Archambault, Frédéric Gosselin
Other Authors: The Pennsylvania State University CiteSeerX Archives
Format: Text
Language: English
Published: 2000
Subjects:
Online Access:http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.494.4121
http://www.mapageweb.umontreal.ca/gosselif/labogo/ar_go_sc.pdf
id ftciteseerx:oai:CiteSeerX.psu:10.1.1.494.4121
record_format openpolar
institution Open Polar
collection Unknown
op_collection_id ftciteseerx
language English
description It is well established that people can categorize the same objects at different levels of abstraction (i.e., superordinate, basic, and subordinate). Of these, the basic level is known to have a privileged status that is often attributed to the organization of categories in memory. Here, we argue that the bias could in part arise from the image formation process itself, i.e., the object properties for categorization that arise from the 2D retinal projections of distal 3D objects. In the real world, people do categorize objects from a variety of viewing distances, and these distances modify the availability of object information on the retina. In two experiments, we tested the hypothesis that the information for basic categorizations is more resistant to changes in viewing distance than that for subordinate categorizations. Casual observers would experience little difficulty categorizing the animals in Figure 1 as exemplars of dog and those of Figure 2 as exemplars of whale. If they were “experts”, they could categorize these animals as Saint-Bernard dog, Doberman dog, Sperm whale, and Humpback whale. People can similarly apply different levels of category abstraction to the 3D distal objects that impinge on their retina. Rosch et al.’s (1976) seminal research isolated three “natural” levels of object categorization: the superordinate (animal, vehicle, furniture), the basic (dog, car, chair), and the subordinate (Saint-Bernard dog, Porsche, Chippendale chair). Of these, the basic and subordinate are thought to be closer to perception, and we will focus on their main differences. The former level is superior to the latter in a number of ways: (1) categories at the basic level are verified fastest (see
author2 The Pennsylvania State University CiteSeerX Archives
format Text
author Annie Archambault
Frédéric Gosselin
title A natural bias for the basic level
publishDate 2000
url http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.494.4121
http://www.mapageweb.umontreal.ca/gosselif/labogo/ar_go_sc.pdf
genre Humpback Whale
Sperm whale
op_source http://www.mapageweb.umontreal.ca/gosselif/labogo/ar_go_sc.pdf
op_relation http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.494.4121
http://www.mapageweb.umontreal.ca/gosselif/labogo/ar_go_sc.pdf
op_rights Metadata may be used without restrictions as long as the oai identifier remains attached to it.