
Commit 618de54

Merge pull request #9079 from microsoft/ntrogh/sync-ignite-updates
Merge Ignite updates in repo
2 parents a5739b4 + 68d70d4 commit 618de54

File tree

17 files changed: +229 -136 lines changed

Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
---
Order: 120
TOCTitle: "Announcing Private Marketplace for VS Code"
PageTitle: "Announcing Private Marketplace for VS Code"
MetaDescription: Private Marketplace for VS Code extensions now generally available.
MetaSocialImage: PrivateMarketplaceHero.png
Date: 2025-11-18
Author: Sean Iyer
---

# Introducing the Visual Studio Code Private Marketplace: Your Team's Secure, Curated Extension Hub 🎉

November 18, 2025 by [Sean Iyer](https://x.com/nuget)

Developers shouldn't have to juggle manual installs, worry about unverified extensions, or chase compliance exceptions just to get their favorite tools. Today, we're excited to announce Private Marketplace for VS Code—a dedicated, enterprise-ready hub that puts you in full control of how extensions are sourced, reviewed, and distributed to your dev teams. 🔐✨

<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/nQLdmy50cb0?si=URhzNdCQ4a4zOSBx" title="Video showing the VS Code private marketplace." frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>

## With the Private Marketplace, you can:

- Curate exactly which extensions appear for your organization, whether they're internal tools or trusted public plugins.
- Lock down sensitive IP by privately hosting your own extensions—no more sharing proprietary code outside your firewall.
- Easily rehost vetted public extensions after your own checks, so your teams can use familiar tools in secure or air-gapped environments.
- Roll out everything centrally—new users get a consistent, up-to-date catalog from day one.

## Same great VS Code experience

Fully integrated into the VS Code experience, your developers will find, install, and update extensions as easily as ever—without leaving the safety of your managed ecosystem. No more ticket-based installs, zip file confusion, or security compromises.

![VS Code Private Marketplace](./vscpmga.png)

> [!NOTE]
> Private Marketplace is available to GitHub Enterprise customers. To access it, VS Code users must sign in with a GitHub Enterprise or Copilot Business or Enterprise account.

## Getting started

Ready to streamline, secure, and scale your extension management? Start with our deployment guide now and bring peace of mind to both your developers and your IT team.

👉 See the [deployment and feature guide](https://aka.ms/private-marketplace/readme) for instructions, scripts, and environment setup.

💬 Need help? Contact [private marketplace support](https://aka.ms/vspm/support).

### 🎯 Start today and empower your teams with a secure, streamlined extension experience!

Happy coding!
Lines changed: 3 additions & 0 deletions

blogs/2025/11/18/vscpmga.png

Lines changed: 3 additions & 0 deletions

docs/configure/extensions/extension-marketplace.md

Lines changed: 7 additions & 1 deletion
@@ -364,6 +364,10 @@ As of VS Code release 1.97, when you first install an extension from a third-par

Get more information about [extension runtime security](/docs/configure/extensions/extension-runtime-security.md).

### Can I host extensions internally for my organization?

Yes, see [Private Marketplace for Extensions](https://code.visualstudio.com/docs/setup/enterprise#_private-marketplace-for-extensions).

### The extension signature cannot be verified by VS Code

The Visual Studio Marketplace signs all extensions when they are published. VS Code verifies this signature when you install an extension to check the integrity and the source of the extension package.

@@ -372,9 +376,10 @@

> When you install an extension, you might see the following error message: `Cannot install extension because Visual Studio Code cannot verify the extension signature`. This error can have a variety of causes; if you encounter it, exercise caution before deciding to install anyway. You can disable extension signature verification with the `setting(extensions.verifySignature)` setting.
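As a concrete illustration (a minimal sketch, not a recommendation): if your organization distributes extensions whose signatures VS Code cannot verify, the check can be turned off in user or workspace `settings.json`. Doing so weakens supply-chain protection, so use it with caution:

```json
{
  "extensions.verifySignature": false
}
```

Leaving the setting at its default (`true`) keeps signature verification enabled.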
#### Package integrity issues

For package integrity issues, it's recommended that you contact the [Visual Studio Marketplace team](mailto:[email protected]?subject=Extension%20Signature%20Verification%20Issue) to report the issue. Make sure to include the extension ID. The following list provides error codes related to package integrity issues:

```text
PackageIntegrityCheckFailed
SignatureIsInvalid
SignatureManifestIsInvalid
...
NotSigned
```

#### Other issues

For other issues like an unsupported environment or unknown reasons, it's recommended that you [report an issue](https://github.com/microsoft/vscode/issues/new) with VS Code by providing all necessary information and including the shared log: `kb(workbench.action.showCommands)` > **Open View...** > **Shared**.

### My extensions don't synchronize when connected to a remote window

docs/datascience/microsoft-fabric-quickstart.md

Lines changed: 109 additions & 103 deletions
@@ -7,7 +7,7 @@ MetaSocialImage: images/datascience/fabric-social.png

# Data science in Microsoft Fabric using Visual Studio Code

You can build and develop data science and data engineering solutions for [Microsoft Fabric](https://learn.microsoft.com/fabric/) within VS Code. [Microsoft Fabric](https://marketplace.visualstudio.com/items?itemName=fabric.vscode-fabric) extensions for VS Code provide an integrated development experience for working with Fabric artifacts, lakehouses, notebooks, and user data functions.

## What is Microsoft Fabric?

@@ -36,23 +36,26 @@ You can find and install the extensions from the [Visual Studio Marketplace](htt

| **Fabric Data Engineering** | Data engineers working with large-scale data & Spark | - Explore Lakehouses (tables, raw files)<br>- Develop/debug Spark notebooks<br>- Build/test Spark job definitions<br>- Sync notebooks between local VS Code & Fabric<br>- Preview schemas & sample data | You work with Spark, Lakehouses, or large-scale data pipelines and want to explore, develop, and debug locally. | [Develop Fabric notebooks in VS Code](https://learn.microsoft.com/fabric/data-engineering/setup-vs-code-extension) |

## Getting started

Once you have the extensions installed and signed in, you can start working with Fabric workspaces and items. In the Command Palette (`kb(workbench.action.showCommands)`), type **Fabric** to list the commands that are specific to Microsoft Fabric.

![Diagram that shows all Microsoft Fabric commands](images/microsoft-fabric/fabric-command-palette.png)

## Fabric Workspace and items explorer

The Fabric extensions provide a seamless way to work with both remote and local Fabric items.

* In the Fabric extension, the **Fabric Workspaces** section lists all items from your remote workspace, organized by type (Lakehouses, Notebooks, Pipelines, and more).
* In the Fabric extension, the **Local folder** section shows the Fabric item folders opened in VS Code. It reflects the structure of the Fabric item definition for each item type that is open in VS Code. This enables you to develop locally and publish your changes to the current or a new workspace.

![Screenshot that shows how to view your workspaces and items](images/microsoft-fabric/view-workspaces-and-items.png)

## Use user data functions for data science

1. In the Command Palette (`kb(workbench.action.showCommands)`), type **Fabric: Create Item**.
1. Select your workspace and select **User data function**. Provide a name and select the **Python** language.
1. You are prompted to set up the Python virtual environment; continue to set this up locally.
1. Install the libraries using `pip install`, or select the user data function item in the Fabric extension to add libraries. Update the `requirements.txt` file to specify the dependencies:

    ```txt
    fabric-user-data-functions ~= 1.0
    ...
    joblib==1.2.0
    ```
6568
66-
4. Open `functions_app.py`. Here's an example of developing a User Data Function for data science using scikit-learn:
69+
1. Open `functions_app.py`. Here's an example of developing a User Data Function for data science using scikit-learn:
6770
68-
```python
69-
import datetime
70-
import fabric.functions as fn
71-
import logging
72-
73-
# Import additional libraries
74-
import pandas as pd
75-
from sklearn.ensemble import RandomForestClassifier
76-
from sklearn.preprocessing import StandardScaler
77-
from sklearn.model_selection import train_test_split
78-
from sklearn.metrics import accuracy_score
79-
import joblib
80-
81-
udf = fn.UserDataFunctions()
82-
@udf.function()
83-
def train_churn_model(data: list, targetColumn: str) -> dict:
84-
'''
85-
Description: Train a Random Forest model to predict customer churn using pandas and scikit-learn.
86-
87-
Args:
88-
- data (list): List of dictionaries containing customer features and churn target
89-
Example: [{"Age": 25, "Income": 50000, "Churn": 0}, {"Age": 45, "Income": 75000, "Churn": 1}]
90-
- targetColumn (str): Name of the target column for churn prediction
91-
Example: "Churn"
92-
93-
Returns: dict: Model training results including accuracy and feature information
94-
'''
95-
# Convert data to DataFrame
96-
df = pd.DataFrame(data)
97-
98-
# Prepare features and target
99-
numeric_features = df.select_dtypes(include=['number']).columns.tolist()
100-
numeric_features.remove(targetColumn)
101-
102-
X = df[numeric_features]
103-
y = df[targetColumn]
104-
105-
# Split and scale data
106-
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
107-
scaler = StandardScaler()
108-
X_train_scaled = scaler.fit_transform(X_train)
109-
X_test_scaled = scaler.transform(X_test)
110-
111-
# Train model
112-
model = RandomForestClassifier(n_estimators=100, random_state=42)
113-
model.fit(X_train_scaled, y_train)
114-
115-
# Evaluate and save
116-
accuracy = accuracy_score(y_test, model.predict(X_test_scaled))
117-
joblib.dump(model, 'churn_model.pkl')
118-
joblib.dump(scaler, 'scaler.pkl')
119-
120-
return {
121-
'accuracy': float(accuracy),
122-
'features': numeric_features,
123-
'message': f'Model trained with {len(X_train)} samples and {accuracy:.2%} accuracy'
124-
}
125-
126-
@udf.function()
127-
def predict_churn(customer_data: list) -> list:
128-
'''
129-
Description: Predict customer churn using trained Random Forest model.
130-
131-
Args:
132-
- customer_data (list): List of dictionaries containing customer features for prediction
133-
Example: [{"Age": 30, "Income": 60000}, {"Age": 55, "Income": 80000}]
134-
135-
Returns: list: Customer data with churn predictions and probability scores
136-
'''
137-
# Load saved model and scaler
138-
model = joblib.load('churn_model.pkl')
139-
scaler = joblib.load('scaler.pkl')
140-
141-
# Convert to DataFrame and scale features
142-
df = pd.DataFrame(customer_data)
143-
X_scaled = scaler.transform(df)
144-
145-
# Make predictions
146-
predictions = model.predict(X_scaled)
147-
probabilities = model.predict_proba(X_scaled)[:, 1]
148-
149-
# Add predictions to original data
150-
results = customer_data.copy()
151-
for i, (pred, prob) in enumerate(zip(predictions, probabilities)):
152-
results[i]['churn_prediction'] = int(pred)
153-
results[i]['churn_probability'] = float(prob)
154-
155-
return results
156-
```
71+
```python
72+
import datetime
73+
import fabric.functions as fn
74+
import logging
75+
76+
# Import additional libraries
77+
import pandas as pd
78+
from sklearn.ensemble import RandomForestClassifier
79+
from sklearn.preprocessing import StandardScaler
80+
from sklearn.model_selection import train_test_split
81+
from sklearn.metrics import accuracy_score
82+
import joblib
83+
84+
udf = fn.UserDataFunctions()
85+
@udf.function()
86+
def train_churn_model(data: list, targetColumn: str) -> dict:
87+
'''
88+
Description: Train a Random Forest model to predict customer churn using pandas and scikit-learn.
89+
90+
Args:
91+
- data (list): List of dictionaries containing customer features and churn target
92+
Example: [{"Age": 25, "Income": 50000, "Churn": 0}, {"Age": 45, "Income": 75000, "Churn": 1}]
93+
- targetColumn (str): Name of the target column for churn prediction
94+
Example: "Churn"
95+
96+
Returns: dict: Model training results including accuracy and feature information
97+
'''
98+
# Convert data to DataFrame
99+
df = pd.DataFrame(data)
100+
101+
# Prepare features and target
102+
numeric_features = df.select_dtypes(include=['number']).columns.tolist()
103+
numeric_features.remove(targetColumn)
104+
105+
X = df[numeric_features]
106+
y = df[targetColumn]
107+
108+
# Split and scale data
109+
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
110+
scaler = StandardScaler()
111+
X_train_scaled = scaler.fit_transform(X_train)
112+
X_test_scaled = scaler.transform(X_test)
113+
114+
# Train model
115+
model = RandomForestClassifier(n_estimators=100, random_state=42)
116+
model.fit(X_train_scaled, y_train)
117+
118+
# Evaluate and save
119+
accuracy = accuracy_score(y_test, model.predict(X_test_scaled))
120+
joblib.dump(model, 'churn_model.pkl')
121+
joblib.dump(scaler, 'scaler.pkl')
122+
123+
return {
124+
'accuracy': float(accuracy),
125+
'features': numeric_features,
126+
'message': f'Model trained with {len(X_train)} samples and {accuracy:.2%} accuracy'
127+
}
128+
129+
@udf.function()
130+
def predict_churn(customer_data: list) -> list:
131+
'''
132+
Description: Predict customer churn using trained Random Forest model.
133+
134+
Args:
135+
- customer_data (list): List of dictionaries containing customer features for prediction
136+
Example: [{"Age": 30, "Income": 60000}, {"Age": 55, "Income": 80000}]
137+
138+
Returns: list: Customer data with churn predictions and probability scores
139+
'''
140+
# Load saved model and scaler
141+
model = joblib.load('churn_model.pkl')
142+
scaler = joblib.load('scaler.pkl')
143+
144+
# Convert to DataFrame and scale features
145+
df = pd.DataFrame(customer_data)
146+
X_scaled = scaler.transform(df)
147+
148+
# Make predictions
149+
predictions = model.predict(X_scaled)
150+
probabilities = model.predict_proba(X_scaled)[:, 1]
151+
152+
# Add predictions to original data
153+
results = customer_data.copy()
154+
for i, (pred, prob) in enumerate(zip(predictions, probabilities)):
155+
results[i]['churn_prediction'] = int(pred)
156+
results[i]['churn_probability'] = float(prob)
157+
158+
return results
159+
```
160+
161+
1. Test your functions locally, by pressing `kbstyle(F5)`.
162+
1. In the Fabric extension, in **Local folder** , select the function and publish to your workspace.
157163
158-
6. Test your functions locally, by pressing `kbstyle(F5)`.
159-
7. In the Fabric extension, in **Local folder** , select the function and publish to your workspace.
160164
![Screenshot that shows how to publish your user data funtions item](./images/microsoft-fabric/publish-user-data-function.png)
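The train-then-predict flow in the functions above can also be exercised outside Fabric. The following standalone sketch drops the `fabric.functions` wrapper and uses only pandas and scikit-learn; the tiny synthetic dataset and the customer records are invented for illustration:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Tiny synthetic dataset (illustrative only): every third customer churns
data = [
    {"Age": 20 + i, "Income": 40000 + 1000 * i, "Churn": int(i % 3 == 0)}
    for i in range(30)
]
df = pd.DataFrame(data)

# Mirror train_churn_model: numeric features, split, scale, fit
features = [c for c in df.columns if c != "Churn"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["Churn"], test_size=0.2, random_state=42
)
scaler = StandardScaler()
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(scaler.fit_transform(X_train), y_train)

# Mirror predict_churn: score new records and attach predictions
new_customers = [{"Age": 30, "Income": 60000}, {"Age": 55, "Income": 80000}]
X_new = scaler.transform(pd.DataFrame(new_customers))
preds = model.predict(X_new)
probs = model.predict_proba(X_new)[:, 1]

results = [
    {**record, "churn_prediction": int(pred), "churn_probability": float(prob)}
    for record, pred, prob in zip(new_customers, preds, probs)
]
print(results)
```

This is useful for sanity-checking the modeling logic locally before wrapping it in `@udf.function()` decorators and publishing.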
Learn more about invoking the function from:

* [Fabric Data pipelines](https://learn.microsoft.com/fabric/data-engineering/user-data-functions/create-functions-activity-data-pipelines)
* [Fabric Notebooks](https://learn.microsoft.com/fabric/data-engineering/notebook-utilities#user-data-function-udf-utilities)
* [An external application](https://learn.microsoft.com/fabric/data-engineering/user-data-functions/tutorial-invoke-from-python-app)

## Use Fabric notebooks for data science

A Fabric notebook is an interactive workbook in Microsoft Fabric for writing and running code, visualizations, and markdown side by side. Notebooks support multiple languages (Python, Spark, SQL, Scala, and more) and are ideal for data exploration, transformation, and model development in Fabric, working with your existing data in OneLake.

### Example

@@ -199,6 +205,7 @@ def train_logistic_from_spark(spark, csv_path):

Refer to the [Microsoft Fabric Notebooks](https://learn.microsoft.com/fabric/data-engineering/how-to-use-notebook) documentation to learn more.

## Git integration

Microsoft Fabric supports Git integration that enables version control and collaboration across data and analytics projects. You can connect a Fabric workspace to Git repositories, primarily Azure DevOps or GitHub; only supported items are synced. This integration also supports CI/CD workflows, enabling teams to manage releases efficiently and maintain high-quality analytics environments.

![GIF that shows how to use Git integration with user data functions](./images/microsoft-fabric/fabric-git-integration.gif)

@@ -207,12 +214,11 @@

Now that you have the Microsoft Fabric extensions set up in VS Code, explore these resources to deepen your knowledge:

* [Learn about Microsoft Fabric for Data Science](https://learn.microsoft.com/en-us/fabric/data-science/tutorial-data-science-introduction)
* [Set up your Fabric trial capacity](https://learn.microsoft.com/fabric/fundamentals/fabric-trial)
* [Microsoft Fabric fundamentals](https://learn.microsoft.com/fabric/fundamentals/fabric-overview)

To engage with the community and get support:

* [Microsoft Fabric community forums](https://community.fabric.microsoft.com/)
* [Fabric samples and templates](https://github.com/microsoft/fabric-samples)
