Augmented reality for A-Frame.

Marker Based


Here are the attributes for this entity:

| Attribute | Description | Component Mapping |
|---|---|---|
| type | type of marker - ['pattern', 'barcode', 'unknown'] | artoolkitmarker.type |
| size | size of the marker in meters | artoolkitmarker.size |
| url | URL of the pattern - only if type='pattern' | artoolkitmarker.patternUrl |
| value | value of the barcode - only if type='barcode' | artoolkitmarker.barcodeValue |
| preset | parameters preset - ['hiro', 'kanji'] | artoolkitmarker.preset |
| emitevents | emits 'markerFound' and 'markerLost' events - ['true', 'false'] | - |
| smooth | turn camera smoothing on/off - ['true', 'false'] - default: false | - |
| smoothCount | number of matrices to smooth tracking over; more = smoother but slower follow - default: 5 | - |
| smoothTolerance | distance tolerance for smoothing; if smoothThreshold matrices are under tolerance, tracking stays still - default: 0.01 | - |
| smoothThreshold | threshold for smoothing; keeps still unless enough matrices are over tolerance - default: 2 | - |
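For example, a barcode marker with events and smoothing enabled might look like this (the barcode value and smoothing settings are illustrative):

```html
<!-- illustrative values: barcode 5, smoothing over 10 matrices -->
<a-marker type='barcode' value='5' emitevents='true' smooth='true' smoothCount='10'>
  <a-box color='red'></a-box>
</a-marker>
```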


The model usually used in augmented reality changes the modelViewMatrix based on the marker position, the camera being static at (0, 0, 0) looking toward negative z.

We also define a model where we move the camera instead of the object: it changes the camera's transform matrix.

This cameraTransform mode feels more intuitive than the modelView mode. cameraTransform fits well in a room-scale setup with multiple markers connected to each other, while modelView can provide multiple independent markers.

        <!-- add artoolkit into your scene -->
        <a-scene artoolkit>
          <!-- define your scene as usual -->
          <!-- define a camera inside the <a-marker-camera> -->
          <a-marker-camera preset='hiro'></a-marker-camera>
        </a-scene>
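When emitevents='true' is set on a marker, its visibility changes can be observed from JavaScript. A minimal sketch; the `#hiro` selector and the `nextVisibility` helper are illustrative, not part of AR.js:

```javascript
// Sketch: reacting to the 'markerFound'/'markerLost' events a marker
// emits when emitevents='true'. The '#hiro' id is illustrative.

// Pure helper so the visibility logic is testable outside the browser.
function nextVisibility(current, eventType) {
  if (eventType === 'markerFound') return true;
  if (eventType === 'markerLost') return false;
  return current; // ignore unrelated events
}

// In the browser:
// const marker = document.querySelector('#hiro');
// marker.addEventListener('markerFound', () => console.log('marker found'));
// marker.addEventListener('markerLost', () => console.log('marker lost'));
```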

Location Based


gps-camera

Required: yes. Max allowed per scene: 1.

This component enables Location Based AR. It has to be added to the camera entity. It handles both the position and the rotation of the camera and is used to determine where the user is pointing their device.

For example:

<a-camera gps-camera rotation-reader></a-camera>

As the example above shows, we also have to add rotation-reader to handle rotation events. See the AR.js documentation for more details.
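The camera position updates can also be observed from JavaScript. A minimal sketch, assuming the `gps-camera-update-position` event and its `event.detail.position` shape described in the AR.js location-based docs (verify against your AR.js version); the `describePosition` helper is illustrative:

```javascript
// Sketch: reading the user's position as gps-camera updates it.
// Assumption to verify: AR.js fires 'gps-camera-update-position' on
// window, with the new coordinates in event.detail.position.

// Pure helper so the formatting logic is testable outside the browser.
function describePosition(detail) {
  const { latitude, longitude } = detail.position;
  return `lat ${latitude.toFixed(5)}, lon ${longitude.toFixed(5)}`;
}

// In the browser:
// window.addEventListener('gps-camera-update-position', (e) => {
//   console.log('Camera is now at', describePosition(e.detail));
// });
```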


| Property | Description | Default Value |
|---|---|---|
| alert | Whether to show a message when the GPS signal accuracy is below positionMinAccuracy | false |
| positionMinAccuracy | Minimum accuracy allowed for the position signal | 100 |
| minDistance | If set, places with a distance from the user lower than this value are not shown. Only a positive value is allowed. Value is in meters. | 0 (disabled) |
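For instance, to warn the user whenever accuracy drops below 50 meters (the threshold is illustrative):

```html
<!-- illustrative: alert when GPS accuracy is worse than 50 m -->
<a-camera gps-camera="alert: true; positionMinAccuracy: 50" rotation-reader></a-camera>
```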


gps-entity-place

Required: yes. Max allowed per scene: no limit.

This component makes every entity GPS-trackable. It assigns a specific world position to the entity, so the user can see it when their phone points at its position in the real world. If the user is far from the entity, they will see it smaller; if it is too far away, they will not see it at all.

It requires latitude and longitude as a single string parameter (example with the a-box A-Frame primitive):

<a-box color="yellow" gps-entity-place="latitude: <your-latitude>; longitude: <your-longitude>"></a-box>


gps-camera-debug

Required: no. Max allowed per scene: 1.

This component should only be added in development environments, not in production. It shows a debug UI with camera information and a list of registered gps-entity-place entities, also showing the distance from the user for each one.

It has to be added to the a-scene:

<a-scene gps-camera-debug embedded arjs='sourceType: webcam; debugUIEnabled: false;'></a-scene>
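The distances shown here, like the minDistance check, are distances in meters between two GPS coordinates. A minimal sketch of how such a distance can be computed with the haversine formula (illustrative, not AR.js's internal implementation):

```javascript
// Great-circle distance in meters between two lat/lon points (degrees),
// using the haversine formula. Illustrative, not AR.js internals.
function haversineMeters(lat1, lon1, lat2, lon2) {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}
```

As a sanity check, one degree of longitude at the equator is roughly 111 km.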

Location Based Support

Tested on a Huawei P20; works like a charm.

Also works well on the iPhone 6.

On iOS 12.2 and later, motion sensors for Safari have to be activated in Settings. If they are not, GeoAR.js will prompt the user to do so. This may change with the final release of iOS 13, which as of September 2019 is not yet out.

We need many more tests, but the first impression is: the newer and more advanced the phone, the better, thanks to higher-quality sensors.