Alexa Skills - Developer Voice And Vote

Welcome to the Alexa Skills Feature Request site! This site enables Alexa Skills Developers to request and vote on features you’d like to see in the developer toolset and services for Alexa.

To keep this site purpose-driven and actionable, we moderate requests. Here’s some guidance on how to create great feature requests that can be evaluated by our development teams. For conversation, dialogue or help, you should visit our Alexa forums. We appreciate your input.

-Alexa Skills team

  1. NLU annotation set tool usability improvements

    A couple of things that could make the tool easier to use from a user perspective, if you don't mind the feedback.

    You can only see 10 utterances per page; it would be great to be able to see 50 or 100 per page.

    It would be good to be able to filter on the columns, e.g. filter all the expected intents containing "xyz".

    It would be good to be able to duplicate an annotation set and then reuse it with edits.

    It would be good to be able to download an annotation set as a CSV, so you can edit it and then upload it again.

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  2. Do not close the skill unless I give a command

    Currently, extending the session timeout duration is not a supported feature of the Alexa Skills Kit.

    Still, I would like a skill to be closed only when I give a certain command, so that I don't need to activate it again. (I only activate one skill at a time.)

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  3. Use dynamic entities with slot elicitation in order to refine choice

    I'd like to set a dynamic entity with specific answers, and then prompt for the slot with that type. This isn't possible because of this error: "No other directives are allowed to be specified with a Dialog directive. The following Directives were returned: [Dialog.UpdateDynamicEntities]"

    The use case is refining the customer's choice where lots of the choices have similar values. The dynamic entity is being used to distinguish between them.
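
    A minimal sketch of the combination described above, assuming the ASK SDK v2 for Node.js; the intent, slot and slot-type names are invented for illustration. Returning both Dialog directives in one response is what currently produces the error:

    ```typescript
    import { getIntentName, getRequestType, HandlerInput, RequestHandler } from 'ask-sdk-core';
    import { Response } from 'ask-sdk-model';

    // Hypothetical names: RefineChoiceIntent, itemSlot and VisibleItemType only
    // illustrate the pattern described in this request.
    const RefineChoiceHandler: RequestHandler = {
      canHandle(handlerInput: HandlerInput): boolean {
        return getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
          && getIntentName(handlerInput.requestEnvelope) === 'RefineChoiceIntent';
      },
      handle(handlerInput: HandlerInput): Response {
        return handlerInput.responseBuilder
          .speak('Which one did you mean?')
          // Narrow the slot type to the handful of currently relevant candidates.
          .addDirective({
            type: 'Dialog.UpdateDynamicEntities',
            updateBehavior: 'REPLACE',
            types: [{
              name: 'VisibleItemType',
              values: [
                { id: 'item-1', name: { value: 'red lamp', synonyms: ['the red one'] } },
                { id: 'item-2', name: { value: 'blue lamp', synonyms: ['the blue one'] } },
              ],
            }],
          })
          // ...and re-prompt for the slot in the same response. Combining this Dialog
          // directive with the one above is what the service rejects today.
          .addDirective({
            type: 'Dialog.ElicitSlot',
            slotToElicit: 'itemSlot',
          })
          .getResponse();
      },
    };
    ```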

    2 votes
    Received  ·  0 comments  ·  Interaction Model
  4. Context awareness (room awareness) for any device placed into a group with an Echo device

    There have been multiple asks from other users over a number of years, yet Amazon isn't acting or reacting.

    Add Alexa context awareness (room awareness) for any device placed in a group with an Echo device, so it can be controlled with natural utterances using the object's name, as long as it is in the group (the same as lights/switches).

    Alexa supports lights, switches and thermostats today. However, I and many others would like to control the devices we choose to put into a group with natural utterances, regardless of whether they are a blind/shade, awning, fan, drape, etc., using the object's name vs having to utter Alexa…

    5 votes
    Received  ·  2 comments  ·  Interaction Model
  5. Provide barge-in flag on request

    As a developer, I would like a field on an incoming intent request payload that tells me whether that request was the result of a barge-in or not.

    Rationale:
    Today, there is no way to know whether an incoming intent in a live session was the result of the user completely listening to the previous response, or the result of the user doing a barge-in by invoking their wake word. There are several use cases where this information would be useful, but the two most important are:


    1. If you were giving instructions to your user at the end of your…
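
    A hypothetical sketch (assuming the ASK SDK v2 for Node.js) of how such a flag might be consumed. The handler name is made up, and the bargeIn field does not exist in today's request envelope; its name and placement are assumptions:

    ```typescript
    import { getRequestType, HandlerInput, RequestHandler } from 'ask-sdk-core';
    import { Response } from 'ask-sdk-model';

    const InstructionsHandler: RequestHandler = {
      canHandle(handlerInput: HandlerInput): boolean {
        return getRequestType(handlerInput.requestEnvelope) === 'IntentRequest';
      },
      handle(handlerInput: HandlerInput): Response {
        // Proposed field, NOT part of the current request format: true when the user
        // interrupted the previous response by invoking the wake word.
        const request = handlerInput.requestEnvelope.request as unknown as { bargeIn?: boolean };

        const speech = request.bargeIn === true
          ? 'Okay, skipping the rest of the instructions.' // user barged in mid-response
          : 'Great, you heard the full instructions.';     // user listened to the end
        return handlerInput.responseBuilder.speak(speech).getResponse();
      },
    };
    ```
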
    12 votes
    Received  ·  0 comments  ·  Interaction Model
  6. Including post-ER slot value in NLU annotation sets

    In my slot values I use lots of synonyms and it would be much easier to maintain the annotation sets if I could specify the canonical string for the expected slot value. In other words, if in my slot value definition I have the following value:

    ```
    {
      "name": {
        "value": "human",
        "synonyms": [
          "girl",
          "boy",
          "woman",
          "man"
        ]
      }
    },
    ```
    I'd love to be able to write in my annotation set:

    | utterance          | Intent                  | slot[creature] |
    |--------------------|-------------------------|----------------|
    | I want to be a boy | CreatureSelectionIntent | human          |
    | Let's try a woman  | CreatureSelectionIntent | human          |

    The canonical values are resolved at some point anyway, and…
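
    For context, here is a sketch (assuming the ASK SDK v2 for Node.js and the creature slot from the example above) of where the resolved canonical value already appears at runtime via entity resolution:

    ```typescript
    import { getRequestType, HandlerInput } from 'ask-sdk-core';
    import { IntentRequest } from 'ask-sdk-model';

    // For "I want to be a boy", slot.value is "boy" while the resolved canonical
    // value returned below is "human".
    function getResolvedCreature(handlerInput: HandlerInput): string | undefined {
      if (getRequestType(handlerInput.requestEnvelope) !== 'IntentRequest') {
        return undefined;
      }
      const request = handlerInput.requestEnvelope.request as IntentRequest;
      const slot = request.intent.slots?.creature;
      const match = slot?.resolutions?.resolutionsPerAuthority?.find(
        (authority) => authority.status.code === 'ER_SUCCESS_MATCH',
      );
      return match?.values[0]?.value.name;
    }
    ```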

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  7. Add native support for the SetNavigation directive

    Currently the model doesn't offer native support for the SetNavigation directive, so a workaround is needed to use it.

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  8. Using a skill's supported invocation phrases should enable and launch into the skill

    Currently, when attempting to interact with a new skill that has not been manually enabled, using either "{launch word} {skill name}" or "{Intent} with {skill name}", Alexa will respond that she does not know. The platform should support matching on the skill name and either auto-enable the skill or prompt the user to enable and use it. If the user chooses to enable the skill, it should then immediately launch into the skill. This would reduce friction from a user perspective and better support new skills in the marketplace. When Alexa merely responds with not knowing how to react to…

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  9. Allow developers to prioritize intents

    Currently the fallback intent is the only intent that can be triggered when no other intent matches. The issue with this is that no slots can be added to the fallback, so no logic can be applied based on what the user said. If we could have a custom intent between the main functionality and the fallback, it would be a lot easier to make things more conversational, add custom NLP and add FAQ sections to skills.
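
    For contrast, a sketch of today's closest approximation: a broad "soft fallback" intent built on AMAZON.SearchQuery. The intent and slot names below are invented, and SearchQuery samples still require a carrier phrase, which is exactly the restriction this request wants removed:

    ```typescript
    // Fragment of an interaction model intent, expressed here as a plain object.
    // "FreeFormQueryIntent" and "query" are hypothetical names.
    const softFallbackIntent = {
      name: 'FreeFormQueryIntent',
      slots: [{ name: 'query', type: 'AMAZON.SearchQuery' }],
      samples: [
        'I have a question {query}',
        'tell me about {query}',
        'can you explain {query}',
      ],
    };
    ```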

    2 votes
    Received  ·  0 comments  ·  Interaction Model
  10. Person names are not captured properly

    I want to make an Alexa skill that takes two person names and tests the love percentage. But what I have found is that Alexa is not accurate when it comes to person names. Is there a way that I can give the person name using spelling? For example, I want to give 'Tejasai' as a slot value, but Alexa is capturing it as 'Tejas'. Instead, I want to give the slot (or input) as
    'T' 'E' 'J' 'A' 'S' 'A' 'I'

    (spelling out each individual character) and it has to take the input as 'TEJASAI'.

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  11. Wait for further questions without the wake word

    Listen for further questions without the wake word after Alexa answers a question.

    Wait for a stop command without the wake word when alarms sound.

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  12. Follow me

    When I have asked Alexa to start playing a CD in one room, but I am moving to another room, I don’t want to restart from the beginning. I want to be able to say “Alexa, follow me to the kitchen” and have what I was doing in the living room appear instead in the kitchen. If I am going back and forth a lot, it may be nice to be able to say “add the kitchen” and have it play in both places, but then be able to remove one device by saying “stop playing in the kitchen”. I…

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  13. Increase the Dynamic Entities update limit for bigger intent slot updates

    When using dynamic entities (slot types whose values are updated through the Dynamic Entities API), there is a limit of 100 values, including synonyms. Over this limit, the update request is simply ignored.
    This limit is far from enough since it includes synonyms: with 30 slot values, for instance, each having 10 synonyms, we are already over the limit (there are then 300 values to update).

    The skill I develop is a game where the user can interact with objects in the room or in their pockets. They often have nearly 20 possible objects, and to make them…
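
    For illustration, a single Dialog.UpdateDynamicEntities directive of the kind described above; the slot type and object names are invented. If, as described, each value and every synonym count toward the 100-entry cap, roughly 30 objects with 10 synonyms each already exceed it:

    ```typescript
    import { Directive } from 'ask-sdk-model';

    const updatePocketObjects: Directive = {
      type: 'Dialog.UpdateDynamicEntities',
      updateBehavior: 'REPLACE',
      types: [{
        name: 'PocketObjectType', // hypothetical slot type
        values: [
          {
            id: 'rusty-key',
            name: {
              value: 'rusty key',
              synonyms: ['old key', 'small key', 'iron key', 'cellar key'],
            },
          },
          // ...a few dozen more game objects like this one, and the whole
          // REPLACE update is silently ignored once the limit is crossed.
        ],
      }],
    };
    ```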

    4 votes
    Received  ·  0 comments  ·  Interaction Model
  14. Passthrough mode - In local mode, forward/catch-all to an intent

    A feature called something like "Passthrough": if a skill is in developer mode and only local (Alexa linked to the same account as the skill author?), forward any request to a predefined "catch all" intent, and if it replies false (cannot handle the request), continue with the normal Alexa parser flow. For hobbyist and local/personal use.

    Reason:

    In my home, I have a custom home-automation system (command(string) > response(string)) with a Telegram bot (called "Memole"). Commands like "Temperature", "Power report", "Activate garden", etc.

    I have a lot of Alexa devices and I develop (note: for myself only) a Custom Skill and a Home Skill.

    Now, for the Telegram equivalent command…
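
    A heavily hypothetical sketch of the skill-side half of such a passthrough, assuming the ASK SDK v2 for Node.js. The CatchAllIntent, the rawCommand slot and the local home-automation call are all invented, and the "hand back to the normal Alexa flow" step is precisely what the platform does not offer today:

    ```typescript
    import { getIntentName, getRequestType, getSlotValue, HandlerInput, RequestHandler } from 'ask-sdk-core';
    import { Response } from 'ask-sdk-model';

    // Hypothetical bridge to a local home-automation bot such as the "Memole"
    // command(string) -> response(string) system described above.
    async function sendToHomeAutomation(command: string): Promise<string | null> {
      // Placeholder: a real implementation would call the local system.
      return command.length > 0 ? `Executed: ${command}` : null;
    }

    const CatchAllPassthroughHandler: RequestHandler = {
      canHandle(handlerInput: HandlerInput): boolean {
        // In the proposed developer-only passthrough mode, unrecognized requests
        // would be routed to a catch-all intent like this one first.
        return getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
          && getIntentName(handlerInput.requestEnvelope) === 'CatchAllIntent';
      },
      async handle(handlerInput: HandlerInput): Promise<Response> {
        const command = getSlotValue(handlerInput.requestEnvelope, 'rawCommand') || '';
        const reply = await sendToHomeAutomation(command);
        if (reply !== null) {
          return handlerInput.responseBuilder.speak(reply).getResponse();
        }
        // Requested behaviour: report "not handled" so Alexa continues its normal
        // parsing. No such signal exists today, so the best a skill can do is apologize.
        return handlerInput.responseBuilder.speak("Sorry, I can't handle that here.").getResponse();
      },
    };
    ```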

    3 votes
    Received  ·  0 comments  ·  Interaction Model
  15. Don't require constant customer dialog interaction for long-form content

    I would love to see some improvement to the possible interactions that can occur between an Alexa skill and customers. Currently, it relies heavily on a back and forth dialog with customers, but isn't well-suited to long-form content (like articles, which I'll use as an example throughout going forward). I'd like to be able to return a relatively small amount of an article at a time, and react to events emitted by the Alexa infrastructure that allow me to prepare the next chunk of the article and present it to the customer seamlessly, without any interruption.

    Say we have a…

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  16. Name-free interaction for private skills

    We would like name-free interaction for private skills.
    Since more users will be using private skills, name-free interaction would make it easier for people to trigger and use them.

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  17. Interaction model size

    Increase the interaction model size limit from 2.5 MB. This limits usability to a great extent. There is a lot of scope from a business perspective if the size is increased.

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  18. add interaction model editor and skill settings to ask toolkit for vs code

    I believe a great addition to the ASK Toolkit for Visual Studio Code would be to bring more of the visual tools and editors into the IDE. It would be great to be able to work through the interaction model and add intents, utterances, slots, etc., as well as the general skill properties, and to be able to validate them before deployment, or to build in the developer console in a way that updates my local files. Then, when I deploy my skill using ask-cli, everything is in sync with my changes.

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  19. Add request/command invocation type to be sent for intents

    Allow an intent to receive information on whether the user requested the intent politely or gave it as a command.

    Especially for kid skills, the ability to alter the response based on the manners involved in invoking the intent could drastically change the response and, if given as an aspect of the skill info in the marketplace, influence parents' decisions to enable it for their children.

    1 vote
    Received  ·  0 comments  ·  Interaction Model
  20. Make Follow Up listening duration configurable

    I'm not a native English speaker and it takes me longer to think about the command I'm going to say. As a result, I often find Follow-Up Mode unusable because Alexa stops listening before I manage to formulate the next command in my head. It's also often a bit unclear when Alexa starts listening (in Follow-Up) and when it stops, so I easily end up in ridiculous command-repeating situations.

    Making this listening time configurable would allow me to extend the duration while keeping it the same for quicker speakers.

    1 vote
    Merged  ·  0 comments  ·  Interaction Model