This reduces memory use by reusing objects, and improves identity/equality
relationships of JavaScript objects on the Python side.
In 77bd8fe5b8 PyProxy objects were reused when the
same Python object was proxied across to JavaScript. This commit does the
same thing but for JsProxy objects going from JavaScript to Python: if a
JsProxy already exists for the JS object about to be proxied across, that
existing JsProxy is reused.
This helps reduce the number of alive objects (memory use), and, more
importantly, improves equality relationships of JavaScript objects on the
Python side. Eg we now get, on the Python side:
import js
print(js.Object == js.Object)
that prints True. Previously it was False.
Note that this change does not make identity work with `is`, for example
`js.Object is js.Object` is actually False. With more work that could be
made True but for now we leave that as-is.
The behaviour with this commit matches Pyodide semantics.
Signed-off-by: Damien George <damien@micropython.org>
Fixes a bug in the binding of self/this to JavaScript methods.
The new semantics match Pyodide's behaviour, at least for the included
tests.
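For example, a rough sketch of the intended behaviour (the objects used here
are illustrative, not taken from the original bug report):
    import js

    arr = js.Array(1, 2, 3)
    arr.push(4)        # `push` must be called with `this` bound to `arr`
    print(arr.length)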
Signed-off-by: Damien George <damien@micropython.org>
This commit improves `get` handling by guarding against implicit unknown
symbols that are accessed directly by specific JS native APIs.
Fixes issue #17657.
Signed-off-by: Andrea Giammarchi <andrea.giammarchi@gmail.com>
JavaScript code uses "Symbol in object" to brand check its own proxies, and
such checks should also work on the Python side.
Signed-off-by: Andrea Giammarchi <andrea.giammarchi@gmail.com>
This commit renames the NORETURN macro, which indicates to the compiler
that a function does not return, to MP_NORETURN, to keep the same naming
convention as other similar macros.
To maintain compatibility with existing code, NORETURN is aliased to
MP_NORETURN, but it is also deprecated for MicroPython v2.
This changeset was created using a similar process to
decf8e6a8b ("all: Remove the "STATIC"
macro and just use "static" instead."), although in this case there were no
documentation or Python scripts that needed changing to reflect the new
macro name.
Signed-off-by: Alessandro Gatti <a.gatti@frob.it>
Previously to this commit, running the test suite on a bare-metal board
required specifying the target (really platform) and device, eg:
$ ./run-tests.py --target pyboard --device /dev/ttyACM1
That's quite a lot to type, and you also need to know what the target
platform is, when a lot of the time you either don't care or it doesn't
matter.
This commit makes it easier to run the tests by replacing both of these
options with a single `--test-instance` (`-t` for short) option. That
option specifies the executable/port/device to test. Then the target
platform is automatically detected.
The `--test-instance` can be passed:
- "unix" (the default) to use the unix version of MicroPython
- "webassembly" to test the webassembly port
- anything else is considered a port/device to pass to Pyboard
There are also some shortcuts to specify a port/device, following
`mpremote`:
- a<n> is short for /dev/ttyACM<n>
- u<n> is short for /dev/ttyUSB<n>
- c<n> is short for COM<n>
For example:
$ ./run-tests.py -t a1
Note that the default test instance is "unix" and so this commit does not
change the standard way to run tests on the unix port, by just doing
`./run-tests.py`.
As part of this change, the platform (and its native architecture if it
supports importing native .mpy files) is shown at the start of the test run.
Signed-off-by: Damien George <damien@micropython.org>
This commit makes it so that PyProxy objects are reused (on the JavaScript
side) when they correspond to the same underlying Python object.
For example, when the same Python function is proxied to JavaScript more
than once, the same PyProxy instance is now used. This means that if `foo`
is a Python function then accessing it on the JavaScript side, such as with
`api.globals().get("foo")`, has the property that:
api.globals().get("foo") === api.globals().get("foo")
Prior to this commit the above was not true because new PyProxy instances
were created each time `foo` was accessed.
Signed-off-by: Damien George <damien@micropython.org>
In JavaScript, when accessing an attribute such as `obj.attr`, a value of
`undefined` is returned if the attribute does not exist. This is unlike
Python semantics, where an `AttributeError` is raised. Furthermore, in some
cases in JavaScript (eg a Proxy instance) `attr in obj` can return false
yet `obj.attr` is still valid and returns something other than `undefined`.
So the only reliable way to determine whether a JavaScript attribute exists
is to attempt `obj.attr` directly.
To more closely match these JavaScript semantics when proxying a JavaScript
object through to Python, change the attribute lookup logic on a `JsProxy`
so that it immediately attempts `obj.attr` instead of first testing if the
attribute exists via `attr in obj`.
This allows JavaScript objects which dynamically create attributes to work
correctly on the Python side, with both `obj.attr` and `obj["attr"]`. Note
that `obj["attr"]` already works in all cases because it immediately does
the subscript access without first testing if the attribute exists.
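For example, a rough sketch using `js.eval` to construct a JavaScript Proxy
whose attributes are created dynamically on access (the handler here is
purely illustrative):
    import js

    obj = js.eval("new Proxy({}, { get: (t, p) => 'dyn-' + String(p) })")
    print(obj.attr)      # now works; previously raised AttributeError
    print(obj["attr"])   # already worked, via direct subscript access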
As a benefit, this new behaviour matches the Pyodide behaviour.
Signed-off-by: Damien George <damien@micropython.org>
This allows increasing the Python recursion depth if needed.
Also increase the default to 2k words. There is enough RAM in the
browser/node context for this to be increased, and having a larger pystack
allows more complex code to run without hitting the limit.
Signed-off-by: Damien George <damien@micropython.org>
In the webassembly port there is no asyncio run loop running at the top
level. Instead the Python asyncio run loop is scheduled through setTimeout
and run by the outer JavaScript event loop. Because tasks can become
runnable from an external (to Python) event (eg a JavaScript callback), the
run loop must be scheduled whenever a task is pushed to the asyncio task
queue, otherwise tasks may wait on the queue forever.
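A rough sketch of the kind of scenario this addresses (the names and timing
here are illustrative):
    import asyncio
    import js

    evt = asyncio.Event()

    async def waiter():
        await evt.wait()
        print("woken by a JavaScript callback")

    asyncio.create_task(waiter())
    # the JS callback below makes `waiter` runnable; the asyncio run loop
    # must then be (re)scheduled for `waiter` to actually resume
    js.setTimeout(evt.set, 100)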
Signed-off-by: Damien George <damien@micropython.org>
This change allows doing a top-level await on an asyncio primitive like
Task and Event.
This feature enables better interaction and synchronisation between
JavaScript and Python, because `api.runPythonAsync` can now be used (called
from JavaScript) to await the completion of asyncio primitives.
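For example, Python code like the following can now be passed to
`api.runPythonAsync()` from JavaScript (a sketch; `worker` is a made-up
coroutine):
    import asyncio

    async def worker():
        await asyncio.sleep(0.1)
        return 42

    task = asyncio.create_task(worker())
    result = await task   # top-level await on an asyncio Task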
Signed-off-by: Damien George <damien@micropython.org>
This allows querying the GC heap size/used/free values, as well as the
number of alive JsProxy and PyProxy objects, referenced by proxy_c_ref and
proxy_js_ref.
Signed-off-by: Damien George <damien@micropython.org>
And clear the corresponding `proxy_c_ref[c_ref]` entry when the finaliser
runs. This then allows the C side to (eventually) garbage collect the
corresponding Python object.
Signed-off-by: Damien George <damien@micropython.org>
And clear the corresponding `proxy_js_ref[js_ref]` entry when the finaliser
runs. This then allows the JavaScript side to (eventually) free the
corresponding JavaScript object.
Signed-off-by: Damien George <damien@micropython.org>
So it's possible to know when an external C function is being called at the
top-level, eg by JavaScript without any intermediate C->JS->C calls.
Signed-off-by: Damien George <damien@micropython.org>
Instead of raising KeyError. These semantics match JavaScript behaviour
and make it much more seamless to pass Python dicts through to JavaScript
as though they were JavaScript {} objects.
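A rough sketch of the effect, using `js.eval` to build an illustrative
JavaScript helper that probes a missing key on a Python dict:
    import js

    d = {"a": 1}
    probe = js.eval("(o) => o.missing === undefined")
    print(probe(d))   # the missing key reads as `undefined` on the JS side
                      # instead of raising KeyError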
Signed-off-by: Damien George <damien@micropython.org>
This adds a new undefined singleton to Python, that corresponds directly to
JavaScript `undefined`. It's accessible via `js.undefined`.
Signed-off-by: Damien George <damien@micropython.org>
This reverts part of commit fa23e4b093, to
make it so that Python `None` converts to JavaScript `null` (and JavaScript
`null` already converts to Python `None`). That's consistent with how the
`json` module converts these values back and forth.
Signed-off-by: Damien George <damien@micropython.org>
And change Py None conversion so it converts to JS undefined.
The semantics for conversion of these objects are then:
- Python None -> JavaScript undefined
- JavaScript undefined -> Python None
- JavaScript null -> Python None
This follows Pyodide:
https://pyodide.org/en/stable/usage/type-conversions.html
Signed-off-by: Damien George <damien@micropython.org>
This commit defines a new `JsException` exception type which is used on the
Python side to wrap JavaScript errors. That's then used when a JavaScript
Promise is rejected, and the reason is then converted to a `JsException`
for the Python side to handle.
This new exception is exposed as `jsffi.JsException`.
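For example, a sketch of catching a rejected Promise on the Python side (the
way the rejection reason is constructed here is illustrative):
    import asyncio
    import js
    import jsffi

    async def main():
        try:
            await js.Promise.reject(js.Error("boom"))
        except jsffi.JsException as exc:
            print("caught:", exc)

    asyncio.create_task(main())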
Signed-off-by: Damien George <damien@micropython.org>
JavaScript semantics are such that the caller of an async function does not
need to await that function for it to run to completion. This commit makes
that behaviour also apply to top-level async Python code run via
`runPythonAsync()`.
Signed-off-by: Damien George <damien@micropython.org>
The `reason` in a rejected promise should be an instance of `Error`. That
leads to better error messages on the JavaScript side.
Signed-off-by: Damien George <damien@micropython.org>
This allows a simple way to run the existing asyncio tests under the
webassembly port, which doesn't support `asyncio.run()`.
Signed-off-by: Damien George <damien@micropython.org>
This commit adds a significant portion of the existing MicroPython asyncio
module to the webassembly port, using parts of the existing asyncio code
and some custom JavaScript parts.
The key difference to the standard asyncio is that this version uses the
JavaScript runtime to do the actual scheduling and waiting on events, eg
Promise fulfillment, timeouts, fetching URLs.
This implementation does not include asyncio.run(). Instead one just uses
asyncio.create_task(..) to start tasks and then returns to JavaScript,
which then runs the tasks.
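For example, a minimal sketch of how a program is structured under this
model:
    import asyncio

    async def blink():
        for _ in range(3):
            print("tick")
            await asyncio.sleep(1)

    # no asyncio.run() here: create the task and return to JavaScript,
    # whose event loop then drives the task to completion
    asyncio.create_task(blink())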
The implementation here tries to reuse as much existing asyncio code as
possible, and gets all the semantics correct for things like cancellation
and asyncio.wait_for. An alternative approach would be to reimplement Task,
Event, etc using JavaScript Promises. That approach is very difficult to
get right when trying to implement cancellation (because it's not possible
to cancel a JavaScript Promise).
Signed-off-by: Damien George <damien@micropython.org>
When a Promise is rejected on the JavaScript side, the reject reason should
be thrown into the encapsulating generator on the Python side.
Signed-off-by: Damien George <damien@micropython.org>
An exception on the Python side should be passed to the Promise reject
callback on the JavaScript side.
Signed-off-by: Damien George <damien@micropython.org>
Otherwise Emscripten allocates it on the Emscripten C stack, which will
overflow for large amounts of code.
Fixes issue #14307.
Signed-off-by: Damien George <damien@micropython.org>
In modularize mode, the `_createMicroPythonModule()` constructor must be
awaited before `Module` is ready to use.
Signed-off-by: Damien George <damien@micropython.org>
This optimises the case where a Python function is, for example, stored on
a JavaScript attribute and then later retrieved from Python. The retrieved
object is no longer a proxy of a proxy, so calling it no longer requires a
double Python -> JavaScript -> Python indirection.
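For example, a sketch of the case being optimised (`js.globalThis` is just
an illustrative place to store the function):
    import js

    def foo():
        return 42

    js.globalThis.foo = foo   # Python function proxied out to JavaScript
    f = js.globalThis.foo     # no longer comes back as a double proxy
    print(f())                # calls straight back into Python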
Signed-off-by: Damien George <damien@micropython.org>