What new features would you like to see in uLisp?


#23

So, it works … on a von Neumann machine like the ARM. The AVR, which needs the optimisation the most, is a Harvard machine, and would need some way to know when to follow conses into code memory. Annoying.


#24

For real-life prototyping there are things we really need. If they can be done, uLisp will be more powerful than MicroPython, and not just from a language point of view. I see real power and potential here. Embedded dev boards are getting cheaper and more capable every year. We need real tools, and uLisp can be that tool!

My priorities:

  • Arrays
  • More sensor library support, including common sensors like the DHT11 and DHT22
  • More LCD display support, such as the Nokia 5110 LCD, which is so common
  • A detailed guide on how we can add sensor support and how we can contribute
  • Support for common camera sensors, like the OV2640
  • The K210 is capable of so many good things, and it has built-in FFT hardware. How can we use it with uLisp? Also, how can we use the built-in microphones? Most K210 and Adafruit boards come with one.
  • TinyML support. This is the most difficult one. New ARM M4 boards come with built-in sensors, and K210 boards already have them. How can we add that?

Am I being too crazy? I hope not. If MicroPython can do it, uLisp should be able to do more :)


#25

Thank you for your enthusiasm!

  • Arrays

This is definitely on the list of things to be added!

  • More sensor library support, including common sensors like the DHT11 and DHT22
  • More LCD display support, such as the Nokia 5110 LCD, which is so common
  • A detailed guide on how we can add sensor support and how we can contribute

Currently there’s no way for users to upload new sensor drivers to the library themselves, but if you post them on the forum I’ll add them.

Perhaps there should be a style guide for defining sensor interfaces, covering things such as how the functions should be named.
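As an illustration, here’s a minimal sketch of the kind of convention I mean, assuming an LM75-style I2C temperature sensor at address #x48 whose temperature is in register 0 as two bytes (the part, address, and register layout are just assumptions for the example):

    ; A minimal example of a sensor interface function, named after the
    ; sensor. Assumes an LM75-style I2C temperature sensor at address
    ; #x48 whose temperature register 0 returns two bytes.
    (defun lm75-temperature ()
      (with-i2c (str #x48)
        (write-byte 0 str)           ; select the temperature register
        (restart-i2c str 2)          ; restart and request two bytes
        (let ((msb (read-byte str))
              (lsb (read-byte str)))
          ; integer degrees in msb; bit 7 of lsb adds 0.5 degrees
          ; (negative temperatures ignored, to keep the example short)
          (+ msb (if (> lsb 127) 0.5 0)))))

A program would then just call (lm75-temperature), and other quantities from the same sensor would get their own sensorname-quantity functions.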

  • Support for common camera sensors, like the OV2640
  • The K210 is capable of so many good things, and it has built-in FFT hardware. How can we use it with uLisp? Also, how can we use the built-in microphones? Most K210 and Adafruit boards come with one.

I thought about this when I was working on the K210 boards. To be useful the camera and microphone need to be interfaced in a way that would allow uLisp programs to access and process their data, and I couldn’t see an obvious way to do this. Any suggestions?
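Purely as a straw man for discussion: one shape such an interface could take is a built-in, say (read-frame), that copies a camera frame into a uLisp array; no such function exists, and the name is invented here. All further processing could then stay in Lisp. For example, once arrays are added, with a small test array standing in for a frame:

    ; A 4x4 greyscale "frame" standing in for real camera data; a
    ; hypothetical built-in such as (read-frame) would fill an array
    ; like this from the camera.
    (defvar frame (make-array '(4 4) :initial-element 128))

    ; Average brightness of a w x h greyscale frame
    (defun brightness (f w h)
      (let ((sum 0))
        (dotimes (y h)
          (dotimes (x w)
            (incf sum (aref f y x))))
        (/ sum (* w h))))

    (brightness frame 4 4)   ; => 128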

  • TinyML support. This is the most difficult one. New ARM M4 boards come with built-in sensors, and K210 boards already have them. How can we add that?

I’m not sure how that could work. Suggestions?

Thanks, David


#26

MicroPython does not even attempt to support the more memory-constrained systems that uLisp does. I suspect that it is also implemented in a way that mirrors CPython by compiling to bytecode and then interpreting that, rather than being entirely an interpreter like uLisp is. MicroPython (and CircuitPython) are also not built on top of the Arduino environment, which can be frustratingly inefficient.

I’m not saying it’s impossible to make uLisp match or exceed them in capabilities, just that these are things to keep in mind.


#27

You’re welcome. Thanks for your fantastic work.

I agree, and I hope we’ll have one. Then we can help more.

I couldn’t see one either. It seems as if only the Chinese developers know how, as with working with the ESP’s WiFi; it’s only on paper.

It seems they built the whole Arduino core from the Kendryte Standalone SDK. Some of the code, especially the parts related to DL algorithms, looks like it’s under a closed license.

So if there is to be a starting point, it should be the Arduino core first, which is mostly hit and miss (though some things, like LVGL, work surprisingly well), and then the Standalone SDK.

About TinyML, the book: https://www.amazon.com/TinyML-Learning-TensorFlow-Ultra-Low-Power-Microcontrollers/dp/1492052043

The code and explanations, boards, sensors, everything: https://www.tensorflow.org/lite/microcontrollers

Basically, it’s about gathering rich sensor data from microcontrollers, using it to create and train deep learning models, and then optimizing those models to run on microcontrollers. The end goal is less cloud dependency, more intelligent IoT devices, etc., which I imagine you all already know.

All the boards that TinyML supports are already supported by uLisp. We just need more sensor drivers, interfaces, and so on. If we have those, we can quickly prototype with them using uLisp, and it will become much more important than we can imagine.

This is the getting-started guide for the more curious: https://www.tensorflow.org/lite/microcontrollers/get_started
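Even with what uLisp has today, the gathering half is almost trivial. A rough sketch, assuming three analogue sensor outputs on pins 0-2 (pin numbering varies by board), that prints readings over the serial connection so a host computer can capture them as a training set:

    ; Print n readings from three analogue inputs as Lisp lists, to be
    ; captured on the host side and used as training data.
    (defun log-samples (n)
      (dotimes (i n)
        (print (list (analogread 0) (analogread 1) (analogread 2)))))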


#28

I should add: it kills me to know that the first neural network code was mostly written in Lisp. I have Tanimoto’s book, and I would love to work on deep learning with Lisp, even a tiny bit; maybe that becomes possible with uLisp…


#29

Thank you for all the suggestions. There’s a beta version of ARM uLisp containing some of the requested features:


#30

A post was split to a new topic: Contributing to the Sensor Library


#31

I bought a Particle Xenon from a local shop; because it’s retired I was able to find it cheap: https://www.sparkfun.com/products/retired/15070

The board has powerful features, especially the onboard 4MB SPI flash, which is great for uLisp. Here is the list of features:

  • Nordic Semiconductor nRF52840 SoC
    • ARM Cortex-M4F 32-bit processor @ 64MHz
    • 1MB flash, 256KB RAM
    • IEEE 802.15.4 - 2006: 250Kbps
    • Bluetooth 5: 2Mbps, 1Mbps, 500Kbps, 125Kbps
    • Supports DSP instructions, HW accelerated Floating Point Unit (FPU) calculations
    • ARM TrustZone CryptoCell-310 Cryptographic and security module
    • Up to +8 dBm TX power (down to -20dBm in 4dB steps)
    • NFC-A tag
  • On-board additional 4MB SPI flash
  • 20 mixed signal GPIO (6x Analog, 8x PWM), UART, I2C, SPI
  • Integrated Li-Po charging and battery connector
  • JTAG (SWD) Connector
  • RGB status LED
  • Reset and Mode buttons
  • On-board PCB antenna for mesh network
  • u.FL connectors for external antennas for mesh network, and NFC
  • FCC, CE, and IC certified. RoHS compliant

I bought this because, with your array support on ARM and the addition of my sensors, I’m hoping I can try some TinyML demos, since uLisp’s save-image/load-image capability is there. When I receive the board, I’ll try it with uLisp and then share my experiences.
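For anyone following along, the save-image/load-image capability is roughly this, on boards with suitable flash or EEPROM (save-image can also be given a function name to run automatically after a reset):

    (save-image)    ; write the current workspace to non-volatile memory
    ; ... after a reset or power cycle ...
    (load-image)    ; restore the saved workspace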

I hope you can add the TensorFlow Lite for Microcontrollers library to uLisp, especially for that chip. The board has enough RAM for the extra code. I’d definitely love to demo it if it can be done.

Thanks for all of your effort.


#32

Looks interesting - I’ve found one at a pretty good price too and ordered it.


#33

Yeah, it’s priced at $11 at the Particle store: https://store.particle.io/products/xenon

If you add the sensors below to the board, you can do all the TinyML demos; I already have most of these sensors:

  • VC0706 TTL Camera
  • IMU - LSM9DS1
  • Microphone - MP34DT05
  • Gesture, light, proximity - APDS9960

The MEMS microphone doesn’t seem very effective in the demos, therefore I will use the MAX9814, which Adafruit used in their demos; other than that, it should look the same, I hope.

I could use your help with some of these sensors that I intend to use, as they are not yet supported in uLisp.
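As a starting point for one of them, here is a rough sketch of probing the APDS9960 over I2C from uLisp. The address (#x39), ID register (#x92), and expected ID value (#xAB) are taken from the datasheet and are worth double-checking:

    ; Returns t if an APDS9960 answers on the I2C bus with the expected ID.
    (defun apds9960-check ()
      (with-i2c (str #x39)
        (write-byte #x92 str)      ; select the ID register
        (restart-i2c str 1)        ; restart and request one byte
        (= (read-byte str) #xAB)))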

You can review all of this from here:

YouTube demo:


#34

They also used this, which I forgot to add to my answer:

  • Stereo 3.7W Class D Audio Amplifier - MAX98306

and this too:

  • Electret Microphone Amplifier - MAX4466 with Adjustable Gain

I have a MAX9814 instead, but it’s enough for the job.
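Both of these amplifiers have a simple analogue output, so reading one from uLisp should just be analogread in a loop. A sketch, assuming the output is wired to analogue pin 0 (pin numbering varies by board), with no attempt at a controlled sample rate:

    ; Collect n microphone samples into a list for later processing.
    (defun record (n)
      (let ((samples nil))
        (dotimes (i n)
          (push (analogread 0) samples))
        (reverse samples)))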


#35

OK, a feature idea: a buffer of, say, the last 5-10 commands given to uLisp. When you press some definable key, for instance up or down (but it should be user-definable in the source), you scroll up and down through this buffer. The command should be presented without immediately being executed, requiring the user to either press Enter, or to edit it. :) It could be based NOT on “physical lines” (the way readline handles it), but on “entire s-expressions”.

Now, WHY would such a facility be needed? Well, in order to test a function with complex inputs. Then you can just re-adjust the function arguments and simply hit Enter, instead of typing out the whole function call with all its arguments again.


#36

Interesting! Are there any other IDEs that provide this feature?


#37

SLIME has a rather extensive history facility, which is indeed based on complete inputs rather than being line-oriented.

I do suspect that this is something that would be much easier to provide in an off-chip IDE, rather than building it into uLisp.


#38

Anything you fire up with “rlwrap” on Unixoids does something like that, e.g. rlwrap sbcl or rlwrap ecl; this gives “line edit history” to ANY program that doesn’t have it by itself. Just the orientation to a “line” instead of a sexp might not be ideal for Lisp, as it often gives you a “line” that is an incomplete form. I think this is a matter of preference.


#39

How 'bout having Control-C abort a running function and/or abort the current input?

Say I’m entering a multi-line function and “CRAP!” I see a misteak :) on the previous line. Just lemme hit a CTRL-C and go back to the REPL prompt. Thoughts?

    -Rusty-