Alexa Skills - Developer Voice And Vote

Welcome to the Alexa Skills Feature Request site! This site enables Alexa skills developers to request and vote on the features they'd like to see in the developer toolset and services for Alexa.

To keep this site purpose-driven and actionable, we moderate requests. Here’s some guidance on how to create great feature requests that can be evaluated by our development teams. For conversation, dialogue or help, you should visit our Alexa forums. We appreciate your input.

-Alexa Skills team

  1. Context Awareness (Room Awareness) for Any Device placed into a Group with Echo Device

    There have been multiple requests for this from other users over a number of years, yet Amazon isn't acting or reacting.

    Alexa Context Awareness (Room Awareness) for any device placed in a group with an Echo device, controlled by natural utterance using the object's name, as long as it is in a group (same as lights/switches).

    Alexa supports lights, switches, and thermostats today. However, I and many others would like to control the devices we choose to insert into a group with natural utterances, regardless of whether they are a blind/shade, awning, fan, drape, etc., by using the object's name rather than having to utter Alexa…

    5 votes

    Received · 2 comments · Interaction Model
  2. Provide barge-in flag on request

    As a developer, I would like a field on an incoming intent request payload that tells me whether that request was the result of a barge-in or not.

    Rationale:
    Today, there is no way to know whether an incoming intent in a live session was the result of the user completely listening to the previous response, or the result of the user doing a barge-in by invoking their wake word. There are several use cases where this information would be useful, but the two most important are:


    1. If you were giving instructions to your user at the end of your…
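
    The idea above could look like this on the wire; a minimal sketch in Python, where the `bargeIn` field, its placement, and the intent name are hypothetical (they illustrate the requested feature, not an existing part of the ASK request format):

```python
# Sketch of the proposed request payload. "bargeIn" is hypothetical:
# it is the flag this feature request asks for, not an existing field.
incoming_request = {
    "request": {
        "type": "IntentRequest",
        "intent": {"name": "NextStepIntent"},  # hypothetical intent name
        "bargeIn": True,  # the user interrupted the previous response
    }
}

def was_barge_in(envelope):
    """Read the hypothetical flag, defaulting to False for requests
    that would predate the feature."""
    return envelope.get("request", {}).get("bargeIn", False)

print(was_barge_in(incoming_request))  # True
```

    Defaulting to False when the field is absent would keep existing skills working unchanged.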
    12 votes

    Received · 0 comments · Interaction Model
  3. Use dynamic entities with slot elicitation in order to refine choice

    I'd like to set a dynamic entity with specific answers, and then prompt for the slot with that type. This isn't possible because of this error: "No other directives are allowed to be specified with a Dialog directive. The following Directives were returned: [Dialog.UpdateDynamicEntities]"

    The use case is refining the customer's choice where lots of the choices have similar values. The dynamic entity is being used to distinguish between them.
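
    For reference, the rejected combination looks roughly like this: a sketch of the response payload as a Python dict, with the directive type names taken from the error message above and the ASK response format.

```python
# A sketch of the response that currently triggers the quoted error:
# an elicit-slot Dialog directive combined with Dialog.UpdateDynamicEntities
# in the same response body (shown as a plain Python dict).
response = {
    "version": "1.0",
    "response": {
        "outputSpeech": {"type": "PlainText", "text": "Which one did you mean?"},
        "directives": [
            {
                "type": "Dialog.UpdateDynamicEntities",
                "updateBehavior": "REPLACE",
                "types": [],  # dynamic entity values elided
            },
            {"type": "Dialog.ElicitSlot", "slotToElicit": "choice"},
        ],
        "shouldEndSession": False,
    },
}

def has_conflicting_directives(resp):
    # The service rejects a Dialog directive combined with any other
    # directive, which is exactly the case this feature request hits.
    types = [d["type"] for d in resp["response"].get("directives", [])]
    has_dialog = any(t.startswith("Dialog.") for t in types)
    return has_dialog and len(types) > 1

print(has_conflicting_directives(response))  # True
```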

    2 votes

    Received · 0 comments · Interaction Model
  4. NLU annotation set tool usability improvements

    A couple of things that could make the tool easier to use from a user perspective, if you don't mind the feedback:

    You can only see 10 utterances per page; it would be great to be able to see 50/100 per page.

    It would be good to filter on the columns, e.g. filter all the expected intents with "xyz".

    It would be good to duplicate an annotation set to then use again with edits.

    It would be good to be able to download an annotation set as a CSV, so you can edit it and then upload it again.

    1 vote

    Received · 0 comments · Interaction Model
  5. Do not close skill unless I give command

    Currently, extending the session timeout duration is not a supported feature of the Alexa Skills Kit.

    Yet, I hope that a skill is closed only when I give a certain command, so that I don't need to activate it again. (I only activate one skill at a time.)

    1 vote

    Received · 0 comments · Interaction Model
  6. Including post-ER slot value in NLU annotation sets

    In my slot values I use lots of synonyms and it would be much easier to maintain the annotation sets if I could specify the canonical string for the expected slot value. In other words, if in my slot value definition I have the following value:

    ```
    {
      "name": {
        "value": "human",
        "synonyms": [
          "girl",
          "boy",
          "woman",
          "man"
        ]
      }
    },
    ```
    I'd love to be able to write in my annotation set


    utterance            Intent                    slot[creature]
    I want to be a boy   CreatureSelectionIntent   human
    Let's try a woman    CreatureSelectionIntent   human

    The canonical values are resolved at some point anyway, and…
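
    What the request boils down to can be sketched in a few lines of Python, using the slot value definition above: resolve a raw token to its canonical value the same way entity resolution would, so the annotation set only has to reference "human".

```python
# The slot value definition above, as Python data, plus the resolution
# step that entity resolution performs at runtime: any synonym maps to
# its canonical value ("human"), which is what the annotation set
# would then be allowed to reference.
slot_values = [
    {"name": {"value": "human", "synonyms": ["girl", "boy", "woman", "man"]}},
]

def resolve_canonical(token):
    """Map a synonym (or the canonical value itself) to the canonical
    value; unknown tokens pass through unchanged."""
    for entry in slot_values:
        name = entry["name"]
        if token == name["value"] or token in name["synonyms"]:
            return name["value"]
    return token

print(resolve_canonical("boy"), resolve_canonical("woman"))  # human human
```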

    1 vote

    Received · 0 comments · Interaction Model
  7. Allow developers to prioritize intents

    Currently the fallback intent is the only intent that can be triggered if no other intents are. The issue with this is that no slots can be added to the fallback, so no logic can be done based on what the user said. If we could have a custom intent between the main functionality and the fallback, it would be a lot easier to make things more conversational, add custom NLP, and add FAQ sections to skills.

    2 votes

    Received · 0 comments · Interaction Model
  8. Add native support for the SetNavigation directive

    Currently the model doesn't offer native support for the SetNavigation directive, so a workaround is needed to use it.

    1 vote

    Received · 0 comments · Interaction Model
  9. Using a skill's supported invocation phrases should enable and launch into the skill.

    Currently, when attempting to interact with a new skill that has not been manually enabled, with either "{launch word} {skill name}" or "{Intent} with {skill name}", Alexa will respond that she does not know. The platform should support matching on the skill name and either auto-enable the skill or prompt the user to enable and use it. If the user chooses to enable the skill, it should then immediately launch into the skill. This would reduce friction points from a user's perspective and better support new skills in the marketplace. When Alexa merely responds with not knowing how to react to…

    1 vote

    Received · 0 comments · Interaction Model
  10. Phonic Recognition

    This feature request comes off the back of https://forums.developer.amazon.com/questions/200304/phonic-recognition.html

    The most recognised and adopted method of teaching kids and grownups to learn English is by using phonics.

    Could Alexa also recognise phonics as well as words? This would allow a whole new type of interactive skill that could teach children to read by getting them to practice phonics.

    Reference from the UK government for teaching kids to read with sounds:

    https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/190599/Letters_and_Sounds_-_DFES-00281-2007.pdf

    33 votes

    Received · 1 comment · Interaction Model
  11. Retrieve the PowerState info through the voice

    Currently, on Smart Home Skills it is not possible to retrieve the PowerState info by voice; the only information I have noticed that can be retrieved is the status of smart locks and the temperature/setpoint of thermostats...
    The power state is probably one of the most used interfaces for smart devices, and retrieving its status would be very useful for the user: in my opinion it's very limiting to interact with a device by voice without knowing whether it is powered on or off.

    16 votes

    Received · 1 comment · Interaction Model
  12. Increase the Dynamic Entities update limit for bigger intent slots updates

    When using dynamic entities (slot types whose values are updated through the Dynamic Entities API), there is a limit of 100 values, including synonyms. Over this limit, the update request is simply ignored.
    This limit is far from enough, since it includes synonyms: with 30 slot values, for instance, each having 10 synonyms, we are already over the limit (we then have 300 values to update).

    The skill I develop is a game where the user can interact with objects in the place or in his pockets. There are often nearly 20 possible objects, and to make them…
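
    A rough count of how quickly the limit is hit, assuming each canonical value and each of its synonyms counts toward it (which is what the behaviour described above suggests):

```python
def count_dynamic_entity_values(entities):
    # Each canonical value plus each of its synonyms is assumed to count
    # toward the 100-value Dynamic Entities limit described above.
    return sum(1 + len(e.get("synonyms", [])) for e in entities)

# 30 slot values with 10 synonyms each, as in the example above:
entities = [
    {"value": f"object{i}", "synonyms": [f"synonym{i}_{j}" for j in range(10)]}
    for i in range(30)
]

total = count_dynamic_entity_values(entities)
print(total, total > 100)  # 330 True -> the whole update would be ignored
```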

    4 votes

    Received · 0 comments · Interaction Model
  13. Passthrough mode - In local mode, forward/catch-all to an intent

    A feature called something like "Passthrough": if a skill is in developer mode, and only locally (Alexa linked to the same account as the skill author?), forward any request to a predefined catch-all intent, and if it replies false, continue with the normal Alexa parser flow. For hobbyist and local/personal use.

    Reason:

    In my home I have a custom home-automation system, command(string) > response(string), with a Telegram bot (called "Memole"). Commands like "Temperature", "Power report", "Activate garden", etc.

    I have lots of Alexa devices, and I have developed (note: for myself only) a custom skill and a home skill.

    Now, for the Telegram equivalent command…

    3 votes

    Received · 0 comments · Interaction Model
  14. Person names are not captured properly

    I want to make an Alexa skill that takes two person names and tests the love percentage. But I have found that Alexa is not accurate when it comes to person names. Is there a way I can give the person name by spelling it out? For example: I want to give 'Tejasai' as a slot value, but Alexa is taking it as 'Tejas'. Instead, I want to give the slot (or input) as
    'T' 'E' 'J' 'A' 'S' 'A' 'I'

    (spelling out each individual character) and have it take the input as 'TEJASAI'.
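
    The post-processing the author is asking for is simple once the spelled-out letters arrive in the slot; a fulfillment-side sketch in Python (the exact transcription Alexa produces for spelled input may vary):

```python
def join_spelled_letters(slot_value):
    """Collapse a spelled-out slot value such as "t. e. j. a. s. a. i."
    (or "T E J A S A I") into a single name. Handles whitespace-separated
    letters with optional trailing dots only."""
    letters = [token.strip(".") for token in slot_value.split()]
    return "".join(letters).upper()

print(join_spelled_letters("t. e. j. a. s. a. i."))  # TEJASAI
```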

    1 vote

    Received · 0 comments · Interaction Model
  15. Provide full user utterance

    Provide full user utterance for every conversation whether Alexa recognizes the intents/slots or not. There are many cases where the Alexa NLU is getting it wrong and in our fulfillment code we could do a double check and greatly improve the user experience. But we can't double check if you won't tell us what the user says every time.

    As mentioned in the forums and on Stack Overflow multiple times. I'm shocked that I couldn't find this when searching the feature requests to date.

    141 votes

    Received · 20 comments · Interaction Model
  16. Wait for further questions without wake up word

    Listen for further questions without the wake word after Alexa answers a question.

    Wait for a stop order without the wake word when alarms sound.

    1 vote

    Received · 0 comments · Interaction Model
  17. Customize Echo Show Home Screen

    The ability to change the size of the clock and to display the date and the next calendar appointment would be appreciated by those with accessibility issues. Maybe a special accessibility screen, similar to Tap to Alexa, that stays on screen.

    9 votes

    Received · 0 comments · Interaction Model
  18. Follow me

    When I have asked Alexa to start playing a CD in one room, but I am moving to another room, I don’t want to restart from the beginning. I want to be able to say “Alexa, follow me to the kitchen” and have what I was doing in the living room appear instead in the kitchen. If I am going back and forth a lot, it may be nice to be able to say “add the kitchen” and have it play in both places, but then be able to remove one device by saying “stop playing in the kitchen”. I…

    1 vote

    Received · 0 comments · Interaction Model
  19. Add multi-value and negative-value slots

    Let's say my user is ordering a burger:
    user: "I want a cheeseburger"
    alexa: "Alright. What toppings would you like?"
    user: "mayo, pickles and no onions"

    In this scenario I've got to create some number of custom topping slots, {topping1}, {topping2}, etc., and then every iteration of sample utterances up to however many toppings might possibly exist on a burger.

    There has to be a better way. I don't want to ask the user for toppings one at a time until they say "no more, please". Collecting multiple values to the same slot would fix this.

    Next, in the same…
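
    Until multi-value and negative-value slots exist, one fulfillment-side workaround is to catch the whole phrase in a single free-form slot and split it yourself; a sketch under the assumption that items are separated by commas or "and", with a leading "no" marking exclusions:

```python
import re

def parse_toppings(phrase):
    """Split a free-form toppings phrase into (wanted, excluded) lists.
    Assumes items are separated by commas or "and", and that a leading
    "no " marks an exclusion; a real skill would also run the items
    through entity resolution."""
    items = [p.strip() for p in re.split(r",|\band\b", phrase) if p.strip()]
    wanted, excluded = [], []
    for item in items:
        if item.startswith("no "):
            excluded.append(item[len("no "):])
        else:
            wanted.append(item)
    return wanted, excluded

print(parse_toppings("mayo, pickles and no onions"))
# (['mayo', 'pickles'], ['onions'])
```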

    7 votes

    Received · 0 comments · Interaction Model
  20. Persist more than 24kb of data between intents in a session

    I'm getting intent failures with no corresponding error messages in logs (using alexa-skills-kit-sdk-for-nodejs and Lambda).

    In troubleshooting, it looks like I may be exceeding the max allowed size of the JSON Response object. I see from the docs that the total size of your response cannot exceed 24 kilobytes.

    I'm using session attributes to store data I need, and in some circumstances that data is larger than 24kb. The same documentation as referenced above says: When returning your response, you can include data you need to persist during the session in the sessionAttributes property. The attributes you provide are then…
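
    A quick way to see whether session attributes are what pushes the response over the limit is to measure their serialized size before returning (a sketch; the 24 KB cap applies to the whole response, so if the attributes alone exceed it, the response certainly will):

```python
import json

def session_attributes_size(attrs):
    # Serialized size in bytes of the sessionAttributes payload alone;
    # the full response must stay under 24 KB, so this gives a lower
    # bound on the response size.
    return len(json.dumps(attrs).encode("utf-8"))

attrs = {"history": ["step"] * 4000}  # deliberately oversized example
size = session_attributes_size(attrs)
print(size > 24 * 1024)  # True -> offload to an external store instead
```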

    21 votes

    Received · 4 comments · Interaction Model