Update, July 13: AI Assistant is available in pre-release versions, but is not bundled with the stable releases of JetBrains IDEs v2023.2. It can be installed as a separate plugin for v2023.2.
When you use AI features, the IDE needs to send your requests and code to the LLM provider. In addition to the prompts you type, the IDE may send additional details, such as pieces of your code, file types, frameworks used, and any other information that may be necessary for providing context to the LLM.
Doesn’t sound like it gives you much transparency or control over the data it sends, no matter which feature you use. Sadly, not usable at my job then.
They do say this though “We also plan to support local and on-premises models. For local models, the supported feature set will most likely be limited.”
This is currently a no-go at my place (I asked), but the AI and security folks were interested in that, as it would allow on-prem/private cloud usage as well as the possibility of using targeted models instead of a generic one.
For example, in the comments on their announcement they confirm they are looking at Azure AI support.
That’s a bummer. We’re strictly regulated, and stuff like this needs to be self-hosted or we can’t use it.