How to configure this code to get the result like the video on Youtube #7
Comments
Hi there. Unfortunately, the video found on YouTube was produced with the old version of Nengo (v1.4), and that code hasn't been fully ported over to Spaun2.0. You can get a similar display output using the
As for your question, both the "eye" and the "arm" in the video you describe are 100% simulated on the computer. Unfortunately, there was no interaction between the Spaun codebase and external hardware.
I should note that it would not be too difficult to interface hardware with Spaun. Hooking up a camera to provide the stimulus input would require modifying the
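To make the camera idea concrete, here is a minimal sketch of what such a hook might compute. It assumes Spaun's visual stimulus is a flattened 28x28 grayscale image (its digit inputs are MNIST-style); the function name `frame_to_stimulus` and the downsampling approach are illustrative, not part of the Spaun codebase.

```python
import numpy as np

def frame_to_stimulus(frame, size=28):
    """Downsample an (H, W) grayscale frame to a flat size*size vector in [0, 1].

    Hypothetical helper: a real camera hook would also need to crop,
    centre, and threshold the digit, as the Spaun vision system expects
    MNIST-like input.
    """
    h, w = frame.shape
    # Block-mean downsampling: partition the frame into a size-by-size grid
    # and average each block.
    ys = np.arange(size + 1) * h // size
    xs = np.arange(size + 1) * w // size
    out = np.empty((size, size))
    for i in range(size):
        for j in range(size):
            out[i, j] = frame[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
    return (out / 255.0).ravel()

# Stand-in for a captured camera frame (e.g. 480x640 grayscale).
frame = np.random.randint(0, 256, size=(480, 640)).astype(float)
vec = frame_to_stimulus(frame)
print(vec.shape)  # (784,)
```

The output vector could then be fed in wherever the stimulus module currently reads its predefined digit images.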
Thank you for your reply!
Not a problem! 😄
Hi, I ran the code as you said, but an error occurs: ImportError: No module named matplotlib_animation.matplotlib_anim, raised from _spaun/animation/nengo_anim.py. I googled this package but couldn't find it. There is a similarly named package, 'matplotlib.animation', but using that failed too. So, where can I find this package? Thanks!
Oh! Sorry. I forgot to mention that you need to clone this repo: https://github.com/xchoo/matplotlib_animation and put it in your spaun2.0 root directory. So it should look like this:
spaun2.0
+- _spaun
+- data
+- matplotlib_animation
Note: You can do this as well:
to get the right directory structure.
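For anyone puzzled by why the directory placement matters: Python resolves `import matplotlib_animation.matplotlib_anim` by searching `sys.path`, which includes the directory you run the script from, so the cloned folder must sit in the spaun2.0 root. A small self-contained sketch (it builds a throwaway package with the same layout rather than cloning the real repo):

```python
import os
import sys
import tempfile

# Recreate the expected layout in a temp dir standing in for the
# spaun2.0 root: <root>/matplotlib_animation/matplotlib_anim.py
root = tempfile.mkdtemp()
pkg = os.path.join(root, "matplotlib_animation")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "matplotlib_anim.py"), "w") as f:
    f.write("ANIM_OK = True\n")

# Running run_spaun.py from the root puts the root on sys.path;
# inserting it manually here has the same effect.
sys.path.insert(0, root)
from matplotlib_animation import matplotlib_anim

print(matplotlib_anim.ANIM_OK)  # True
```

Without the package folder on `sys.path`, the same import raises exactly the `ImportError: No module named matplotlib_animation.matplotlib_anim` reported above.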
👌I'll try it! Thank you!
Hi,
I successfully ran this code and got the result as a string.
I've read the code and found that the input is the def_str and the output is displayed on my console.
Could I set things up so that the input is provided by an 'eye' and the output is written by an 'arm', like in the video on YouTube? If so, how can I configure it, and what hardware would I need?
Thank you!