Support for AWS SQS instead of opening up a port on the router #117
Comments
Interesting, I managed to get it to work; it would be good to not require the opening of ports. The problem is it is another thing to have to install, maintain and support. But thanks for pointing it out. I have also looked at other ways to handle it, such as AWS IoT (MQTT) and socket.io, but they too require a significant amount of setup. The easiest (for the end user) option would be to install a plugin on Kodi and OAuth it with a dedicated server, which many would have trust issues with :)
As long as these things are optional, I wouldn't mind, personally. For me, though, I like the way it is.
+1 for both SQS and MQTT
@badr-ghatasheh, just so I have a frame of reference: why can't you port forward? This honestly seems even more difficult to set up than port forwarding.
A little update. I have a working version of a pub/sub service using Python with AWS IoT. It is very simple to add to the Skill, as it just replaces the SendCommand function (which you could define in the config): instead of sending an HTTP request, it publishes the request to a topic on AWS, which the service on your local machine subscribes to. When it spots the updated topic, it relays the request to the Kodi JSON-RPC interface over your local network.

My first challenge was to get it working as a proof of concept, and then explore the options. Ideally I would create a Kodi addon/script/service, with the worst case being having to run the Python script manually somewhere on your local network.

My main Kodi library is quite small, as I keep it current rather than as an archive. When I tried it with my dev Kodi library (adding 15,000 files to it) I spotted a major flaw: AWS limits the message size for MQTT and SQS to 256 KB. Whilst my 142 movies were only 6 KB, 3,640 artists came to 280 KB, and the songs to 730 KB. So for now, MQTT/SQS is not an option. I doubt any of the freebie message brokers will allow big messages (PubNub has a 32 KB limit). A private one would, but if you have a private MQTT broker then you probably have enough security knowledge to set up the port forwarding securely yourself. (Google seems to allow 10 MB, so I will have a look out of curiosity, but it's not realistic to expect users to set up a Google Cloud Platform account 😄 )
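To make that limit concrete, here is a minimal sketch of the size check involved (the payload shapes are hypothetical, not the skill's actual data model):

```python
import json

AWS_MESSAGE_LIMIT = 256 * 1024  # AWS IoT/SQS per-message cap: 256 KB


def fits_in_one_message(payload, limit=AWS_MESSAGE_LIMIT):
    """Return True if the JSON-encoded payload fits in a single message."""
    return len(json.dumps(payload).encode("utf-8")) <= limit


# A small movie library easily fits under the cap...
movies = [{"movieid": i, "label": f"Movie {i}"} for i in range(142)]
print(fits_in_one_message({"movies": movies}))

# ...but a large music library (15,000 songs) blows straight past it.
songs = [{"songid": i, "label": f"Song title number {i}"} for i in range(15000)]
print(fits_in_one_message({"songs": songs}))
```

This is why shipping whole library listings over MQTT/SQS breaks down, while small command-style requests are fine.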
I use MQTT to send the Alexa request to the skill, which I have running on my local network. This ensures the message size is small, and also that sequential calls to the Kodi API for searches etc. are performed locally, minimising latency. This requires running a simple Lambda function which proxies between Alexa and MQTT, and a subscriber of some sort on your local network proxying the requests to/from the skill. It's a slightly convoluted setup, but was worth it to avoid opening a port on my network to the internet, which I wasn't willing to do. Here's the code I use to do both jobs, FWIW: https://github.com/jagsta/alexa-mqtt-utils
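One way such a Lambda-side proxy can match each request to its reply is a correlation id plus a per-request reply topic. A minimal sketch (the topic names and envelope fields here are assumptions for illustration, not the actual protocol in alexa-mqtt-utils):

```python
import json
import uuid


def wrap_alexa_request(alexa_event, reply_prefix="alexa/replies"):
    """Wrap an Alexa event in an envelope with a correlation id and a
    per-request reply topic, so the Lambda can subscribe to exactly one
    topic and wait for the skill's response to this request."""
    corr = uuid.uuid4().hex
    envelope = {
        "correlation_id": corr,
        "reply_topic": f"{reply_prefix}/{corr}",  # skill publishes its answer here
        "event": alexa_event,  # the original Alexa request, passed through untouched
    }
    return corr, json.dumps(envelope)


corr, message = wrap_alexa_request({"request": {"type": "IntentRequest"}})
```

The Lambda would publish `message` to the request topic, subscribe to the embedded `reply_topic`, and return the skill's response to Alexa once it arrives.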
Ah I see, what a good idea. Rather than sending all the heavy responses from Kodi to the Skill via MQTT (which is what I was trying to do), you just send the light requests from Alexa to the Skill.

Whilst I have had the Skill running locally (and securely), I was not happy about having port 443 open on my router. I was obviously even less happy about opening port 8080, so my current solution (also slightly convoluted) is to open a non-standard port which is forwarded to a reverse proxy (Nginx) running on a Pi. The proxy checks for SSL and that requests are to a specific sub-URL (a randomly generated 64-character sub-URL) of my DDNS (self-hosted). Nginx also checks that the request is from an Amazon Lambda IP in the correct region. I have a different randomly generated URL for each Kodi install on my network; Nginx then forwards the request to the relevant local IP based on the sub-URL the request is coming from. (I know you didn't ask, but I thought I would share in case others were looking for alternative methods.)

As your clever idea removes having to open any ports, I will definitely look at self-hosting again. Ideally it would be nice to find a secure solution for the masses. I do worry how many people have mistakenly opened up ports or incorrectly set up port forwarding in their attempts to get this working, or use kodi/kodi as the login details. Granted, it is not our responsibility, and no matter what solution we provide, those same people might still use the default admin/admin login for their router. But it would be a step in the right direction 😄
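For anyone wanting to try something similar, a rough sketch of that kind of Nginx setup follows. Every concrete value here (hostname, cert paths, sub-URL, IP range, backend address) is a placeholder, not the actual configuration:

```nginx
server {
    listen 8443 ssl;                        # non-standard external port
    server_name example.duckdns.org;        # placeholder DDNS hostname

    ssl_certificate     /etc/ssl/kodi.crt;  # placeholder cert paths
    ssl_certificate_key /etc/ssl/kodi.key;

    # One randomly generated 64-character sub-URL per Kodi install.
    location /<random-64-char-path>/ {
        # Only accept requests from the Lambda IP range for your region
        # (look up the current published AWS IP ranges; this is a placeholder).
        allow 52.0.0.0/8;
        deny  all;

        # Forward to the Kodi web server for this install on the LAN.
        proxy_pass http://192.168.1.50:8080/;
    }
}
```

A second `location` block with a different random path and a different `proxy_pass` target handles each additional Kodi install.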
The biggest issue is that whatever we do provide 'for the masses' needs to be easier (or close to as easy) than simply opening the port on the router; else, no one will use it anyway ;)

Edit: by 'no one', of course, I mean those that do such things as use default passwords and don't care that it's sent in plaintext. Honestly, it's a bit annoying (and odd) that Kodi's built-in web server doesn't at least support digest auth and/or TLS. That would be "good enough" for most people, IMHO.
I agree, it should be as easy, and definitely less risky for those users that might get a little click-happy when trying to get something to work. As demonstrated in the forum, where a user had purchased and installed some "port forwarding software" in an attempt to get it working... fek knows what that was or did, the mind boggles 😕
By using an AWS SQS FIFO queue, it should be possible to extend/modify the lambda-deploy script to provision the SQS queue as well as the Lambda forwarder in a fairly straightforward manner. I think most people's usage should fit into the free tier of SQS, as well as Lambda. I wonder if using a Docker container to host both the skill and the local proxy which connects to the SQS queue could make deployment easy enough for the average user? A deployment script which you run on your local machine, which pulls the Docker container(s), runs them, and does the Lambda deployment and SQS provisioning, would be ideal.
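For reference, sending to a FIFO queue with boto3 requires two extra parameters on `send_message`: a `MessageGroupId` (messages within a group are delivered in order) and a `MessageDeduplicationId`. A minimal sketch of building those parameters (the queue URL and group name are placeholders):

```python
import hashlib
import json


def fifo_message_params(queue_url, payload, group="kodi-skill"):
    """Build the kwargs for boto3's sqs_client.send_message() on a FIFO queue."""
    body = json.dumps(payload, sort_keys=True)
    return {
        "QueueUrl": queue_url,
        "MessageBody": body,
        # FIFO queues preserve ordering within a message group...
        "MessageGroupId": group,
        # ...and drop duplicates sharing the same deduplication id.
        "MessageDeduplicationId": hashlib.sha256(body.encode("utf-8")).hexdigest(),
    }


params = fifo_message_params(
    "https://sqs.us-east-1.amazonaws.com/123456789012/kodi.fifo",  # placeholder URL
    {"jsonrpc": "2.0", "method": "Player.PlayPause", "id": 1},
)
```

The Lambda forwarder would then call `sqs_client.send_message(**params)`, and the local proxy would long-poll the queue with `receive_message` and relay each body to Kodi's JSON-RPC endpoint.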
It would be nice if this skill supported AWS SQS, so that you don't have to open up a port on your router. See https://github.com/rgraciano/echo-sonos/tree/master/echo-sqs-proxy