There is no absolute minimum size: what matters is the ratio between the physical size of the marker and the resolution of the camera image. It is a tradeoff: the larger the camera image, the slower the processing, but the smaller (or farther away) the marker can be.
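To make that ratio concrete, here is a rough back-of-envelope sketch assuming a simple pinhole-camera model; the resolution, field of view, marker size, and distance below are all illustrative values, not AR.js parameters or thresholds.

```js
// Back-of-envelope sketch (not part of AR.js): estimate how many pixels a
// marker covers in the camera image, using a simple pinhole-camera model.
// Every number below is an illustrative assumption.
const imageWidthPx = 640;        // camera image width in pixels
const horizontalFovDeg = 60;     // assumed horizontal field of view
const markerSizeMeters = 0.08;   // an 8 cm printed marker
const distanceMeters = 1.0;      // camera-to-marker distance

// Focal length expressed in pixels for this image width and field of view.
const focalPx = (imageWidthPx / 2) / Math.tan((horizontalFovDeg / 2) * Math.PI / 180);

// Apparent marker width in pixels: the larger this is, the easier detection gets.
// Doubling the image width doubles it (at a processing cost); doubling the
// distance or halving the marker size halves it.
const markerWidthPx = (focalPx * markerSizeMeters) / distanceMeters;
console.log(`marker covers roughly ${markerWidthPx.toFixed(0)} px of a ${imageWidthPx} px wide image`);
```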
To run AR.js locally on your computer, first clone a copy of the repository and change to the AR.js folder:

```bash
git clone git@github.com:jeromeetienne/AR.js.git
cd AR.js
```
After that, serve the files using a static HTTP server. I use a simple command-line HTTP server called `http-server`. This can be installed using npm:

```bash
npm install -g http-server
```
To start `http-server`, simply run:

```bash
http-server
```
On mobile, accessing the camera using `getUserMedia` requires a secure HTTPS connection to the server. To set this up, you will need to generate a self-signed certificate by running:

```bash
openssl req -newkey rsa:2048 -new -nodes -x509 -days 3650 -keyout key.pem -out cert.pem
```
This will generate two files: `key.pem` and `cert.pem`. You can then run the server with the `-S` flag to enable SSL and `-C` to point at your certificate file (the `-o` flag simply opens the page in your browser once the server starts):

```bash
http-server -S -C cert.pem -o
```
Alternatively, you can deploy to GitHub Pages, which is served over HTTPS by default. This avoids having to configure an SSL server.
When working from localhost, you can also skip HTTPS, since localhost is treated as a secure origin.
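For context, the sketch below shows the kind of `getUserMedia` call that needs a secure context. It is an illustration only, not AR.js's actual camera setup, and the video constraints are assumptions; on a plain-HTTP origin other than localhost, `navigator.mediaDevices` is typically unavailable or the request is rejected.

```js
// Minimal sketch of a secure-context camera request (illustrative, not AR.js internals).
async function startCamera() {
  if (!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia) {
    throw new Error('getUserMedia unavailable - are you on HTTPS or localhost?');
  }
  // Ask for the rear-facing camera on mobile; this constraint is an illustrative assumption.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: 'environment' },
  });
  const video = document.createElement('video');
  video.setAttribute('playsinline', ''); // keep playback inline on iOS
  video.muted = true;                    // allow playback without a user gesture
  video.srcObject = stream;
  await video.play();
  return video;
}

startCamera().catch((err) => console.error('Camera access failed:', err));
```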
Thanks to @mritzco for configuration directions.
It works on any browser with WebRTC and WebGL, and runs on Android, iOS, and Windows Mobile.
As an experiment, it has also been run on the HTC Vive by @robenghuse and on HoloLens by @evhan55. What is interesting is that I didn't change a single line of code to support those devices: AR.js ran out of the box on both the HTC Vive and HoloLens, which I think is clear proof that the web is truly cross-platform.