957498: Removed auto rendermode related content. #6144

Open · wants to merge 17 commits into base: development

Commits (17)
515bcd9
957498: Removed auto rendermode related content.
ArunKumar-SF3979 Jun 17, 2025
28c2f83
957498: Removed auto rendermode related content.
ArunKumar-SF3979 Jun 17, 2025
a709cfe
Merge branch 'development' into 957498-SmartComponents
ArunKumar-SF3979 Jun 17, 2025
a946cc7
Merge branch 'development' into 957498-SmartComponents
ArunKumar-SF3979 Jun 18, 2025
c617d4c
957498: Removed auto rendermode related content.
ArunKumar-SF3979 Jun 18, 2025
2e8771b
Merge branch '957498-SmartComponents' of https://github.com/syncfusio…
ArunKumar-SF3979 Jun 18, 2025
a87e1f6
Merge branch 'development' into 957498-SmartComponents
ArunKumar-SF3979 Jun 23, 2025
ee89597
Merge branch 'development' into 957498-SmartComponents
ArunKumar-SF3979 Jun 23, 2025
ea4c761
957498: Added proper title for all custom services.
ArunKumar-SF3979 Jun 23, 2025
e2bef98
Merge branch '957498-SmartComponents' of https://github.com/syncfusio…
ArunKumar-SF3979 Jun 23, 2025
6943722
957498: Added proper title for all custom services.
ArunKumar-SF3979 Jun 23, 2025
6cfecdb
957498: Added proper title for all custom services.
ArunKumar-SF3979 Jun 23, 2025
f5a82fc
Merge branch 'development' into 957498-SmartComponents
ArunKumar-SF3979 Jun 25, 2025
36b5f4e
Merge branch 'development' into 957498-SmartComponents
ArunKumar-SF3979 Jun 27, 2025
1edee99
Merge branch 'development' into 957498-SmartComponents
ArunKumar-SF3979 Jun 30, 2025
6f8eccf
Merge branch 'development' into 957498-SmartComponents
ArunKumar-SF3979 Jul 2, 2025
96a1362
Merge branch 'development' into 957498-SmartComponents
Karthigaiselvi-SF2856 Jul 10, 2025
29 changes: 18 additions & 11 deletions blazor/smart-paste/getting-started-webapp.md
@@ -70,7 +70,7 @@ For example, in a Blazor Web App with the `Auto` interactive render mode, use th
{% tabs %}
{% highlight c# tabtitle="Blazor Web App" %}

dotnet new blazor -o BlazorWebApp -int Auto
dotnet new blazor -o BlazorWebApp -int Server
cd BlazorWebApp

@@ -81,7 +81,7 @@ N> For more information on creating a **Blazor Web App** with various interactiv

## Install Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor SmartComponents and Themes NuGet in the App

If you utilize `WebAssembly` or `Auto` render modes in the Blazor Web App need to be install Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor components NuGet packages within the client project.
If you utilize the `Server` render mode in the Blazor Web App, install the Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor components NuGet packages within the server project.

* Press <kbd>Ctrl</kbd>+<kbd>`</kbd> to open the integrated terminal in Visual Studio Code.
* Ensure you’re in the project root directory where your `.csproj` file is located.
@@ -109,7 +109,6 @@ N> Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor components are availa

| Interactive Render Mode | Description |
| -- | -- |
| WebAssembly or Auto | Open **~/_Imports.razor** file from the client project.|
| Server | Open the **~/_Imports.razor** file, which is located in the `Components` folder.|

Import the `Syncfusion.Blazor` and `Syncfusion.Blazor.SmartComponents` namespaces.
@@ -123,10 +122,6 @@ Import the `Syncfusion.Blazor` and `Syncfusion.Blazor.SmartComponents` namespace
{% endhighlight %}
{% endtabs %}

Now, register the Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Service in the **~/Program.cs** file of your Blazor Web App.

If the **Interactive Render Mode** is set to `WebAssembly` or `Auto`, you need to register the Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor service in both **~/Program.cs** files of your Blazor Web App.

If the **Interactive Render Mode** is set to `Server`, your project will contain a single **~/Program.cs** file. So, you should register the Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Service only in that **~/Program.cs** file.

{% tabs %}
@@ -166,7 +161,7 @@ builder.Services.AddSyncfusionBlazor();

string apiKey = "api-key";
string deploymentName = "deployment-name";
string endpoint = "end point url";
string endpoint = "end point url"; // Must be null for OpenAI

builder.Services.AddSyncfusionSmartComponents()
.ConfigureCredentials(new AIServiceCredentials(apiKey, deploymentName, endpoint))
@@ -190,9 +185,21 @@ N> From version 28.2.33, the Azure.AI.OpenAI package has been removed from the S

If you are using **OpenAI**, [create an API key](https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key) and assign it to `apiKey`, leaving the `endpoint` as `""`. The value for `deploymentName` is the [model](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models) you wish to use (e.g., `gpt-3.5-turbo`, `gpt-4`, etc.).
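
For illustration, an OpenAI-specific version of the credentials shown above might look like the following sketch. It reuses only the `AddSyncfusionSmartComponents` and `ConfigureCredentials` calls visible in the snippet above; the placeholder values are assumptions, and any additional registration from the full snippet still applies.

{% tabs %}
{% highlight c# tabtitle="~/Program.cs" %}

// Illustrative OpenAI setup (placeholder values; verify against the full snippet above).
string apiKey = "sk-your-openai-api-key"; // key created in the OpenAI dashboard
string deploymentName = "gpt-4";          // the OpenAI model to use
string endpoint = "";                     // left empty when targeting OpenAI directly

builder.Services.AddSyncfusionSmartComponents()
    .ConfigureCredentials(new AIServiceCredentials(apiKey, deploymentName, endpoint));

{% endhighlight %}
{% endtabs %}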

### Using Ollama
### Configuring Ollama for Self-Hosted AI Models

To use Ollama for running self-hosted models:

1. **Download and install Ollama**
Visit [Ollama's official website](https://ollama.com) and install the application appropriate for your operating system.

2. **Install the desired model from the Ollama library**
You can browse and install models from the [Ollama Library](https://ollama.com/library) (e.g., `llama2:13b`, `mistral:7b`, etc.).

3. **Configure your application**

If you are using [Ollama](https://ollama.com/), set SelfHosted to true and provide the Endpoint URL where the model is hosted (e.g., http://localhost:11434). The value for DeploymentName should be the specific [model](https://ollama.com/library) you wish to use (e.g., `llama2:13b`, `mistral:7b`, etc.).
- Set `SelfHosted` to `true`.
- Provide the `Endpoint` URL where the model is hosted (e.g., `http://localhost:11434`).
- Set `DeploymentName` to the specific model you installed (e.g., `llama2:13b`).
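
Put together, a minimal configuration sketch based on these steps might look like the following. The `SelfHosted`, `Endpoint`, and `DeploymentName` members are inferred from the description above rather than from an API reference, so treat this as a sketch and verify it against the full snippet below and your installed Syncfusion<sup style="font-size:70%">&reg;</sup> version.

{% tabs %}
{% highlight c# tabtitle="~/Program.cs" %}

// Illustrative self-hosted (Ollama) setup; member names inferred from the steps above.
builder.Services.AddSyncfusionSmartComponents()
    .ConfigureCredentials(new AIServiceCredentials
    {
        SelfHosted = true,                   // use a locally hosted model instead of a cloud service
        Endpoint = "http://localhost:11434", // default local Ollama endpoint
        DeploymentName = "llama2:13b"        // model installed with `ollama pull llama2:13b`
    });

{% endhighlight %}
{% endtabs %}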

Add the following settings to the **~/Program.cs** file in your Blazor Server app.

@@ -307,4 +314,4 @@ N> [View Sample in GitHub](https://github.com/syncfusion/smart-ai-samples).

## See also

* [Getting Started with Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Smart Paste Button Blazor Server App](https://blazor.syncfusion.com/documentation/)
* [Getting Started with Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Smart Paste Button Blazor Server App](https://blazor.syncfusion.com/documentation/smart-paste/getting-started-webapp)
20 changes: 16 additions & 4 deletions blazor/smart-paste/getting-started.md
@@ -145,7 +145,7 @@ builder.Services.AddSyncfusionBlazor();

string apiKey = "api-key";
string deploymentName = "deployment-name";
string endpoint = "end point url";
string endpoint = "end point url"; // Must be null for OpenAI

builder.Services.AddSyncfusionSmartComponents()
.ConfigureCredentials(new AIServiceCredentials(apiKey, deploymentName, endpoint))
@@ -169,9 +169,21 @@ N> From version 28.2.33, the Azure.AI.OpenAI package has been removed from the S

If you are using **OpenAI**, create an API key and assign it to `apiKey`, leaving the `endpoint` as `""`. The value for `deploymentName` is the model you wish to use (e.g., `gpt-3.5-turbo`, `gpt-4`, etc.).
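
For illustration, an OpenAI-specific version of the credentials shown above might look like the following sketch. It reuses only the `AddSyncfusionSmartComponents` and `ConfigureCredentials` calls visible in the snippet above; the placeholder values are assumptions, and any additional registration from the full snippet still applies.

{% tabs %}
{% highlight c# tabtitle="~/Program.cs" %}

// Illustrative OpenAI setup (placeholder values; verify against the full snippet above).
string apiKey = "sk-your-openai-api-key"; // key created in the OpenAI dashboard
string deploymentName = "gpt-4";          // the OpenAI model to use
string endpoint = "";                     // left empty when targeting OpenAI directly

builder.Services.AddSyncfusionSmartComponents()
    .ConfigureCredentials(new AIServiceCredentials(apiKey, deploymentName, endpoint));

{% endhighlight %}
{% endtabs %}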

### Using Ollama
### Configuring Ollama for Self-Hosted AI Models

If you are using [Ollama](https://ollama.com/), set SelfHosted to true and provide the Endpoint URL where the model is hosted (e.g., http://localhost:11434). The value for DeploymentName should be the specific [model](https://ollama.com/library) you wish to use (e.g., `llama2:13b`, `mistral:7b`, etc.).
To use Ollama for running self-hosted models:

1. **Download and install Ollama**
Visit [Ollama's official website](https://ollama.com) and install the application appropriate for your operating system.

2. **Install the desired model from the Ollama library**
You can browse and install models from the [Ollama Library](https://ollama.com/library) (e.g., `llama2:13b`, `mistral:7b`, etc.).

3. **Configure your application**

- Set `SelfHosted` to `true`.
- Provide the `Endpoint` URL where the model is hosted (e.g., `http://localhost:11434`).
- Set `DeploymentName` to the specific model you installed (e.g., `llama2:13b`).
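
Put together, a minimal configuration sketch based on these steps might look like the following. The `SelfHosted`, `Endpoint`, and `DeploymentName` members are inferred from the description above rather than from an API reference, so treat this as a sketch and verify it against the full snippet below and your installed Syncfusion<sup style="font-size:70%">&reg;</sup> version.

{% tabs %}
{% highlight c# tabtitle="~/Program.cs" %}

// Illustrative self-hosted (Ollama) setup; member names inferred from the steps above.
builder.Services.AddSyncfusionSmartComponents()
    .ConfigureCredentials(new AIServiceCredentials
    {
        SelfHosted = true,                   // use a locally hosted model instead of a cloud service
        Endpoint = "http://localhost:11434", // default local Ollama endpoint
        DeploymentName = "llama2:13b"        // model installed with `ollama pull llama2:13b`
    });

{% endhighlight %}
{% endtabs %}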

Add the following settings to the **~/Program.cs** file in your Blazor Server app.

@@ -290,4 +302,4 @@ N> [View Sample in GitHub](https://github.com/syncfusion/smart-ai-samples).

## See also

* [Getting Started with Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Smart Paste Button Blazor Web App](https://blazor.syncfusion.com/documentation/)
* [Getting Started with Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Smart Paste Button Blazor Web App](https://blazor.syncfusion.com/documentation/smart-paste/getting-started)
31 changes: 19 additions & 12 deletions blazor/smart-textarea/getting-started-webapp.md
@@ -66,12 +66,12 @@ You can create a **Blazor Web App** using Visual Studio Code via [Microsoft Temp

You need to configure the corresponding [Interactive render mode](https://learn.microsoft.com/en-us/aspnet/core/blazor/components/render-modes?view=aspnetcore-8.0#render-modes) and [Interactivity location](https://learn.microsoft.com/en-us/aspnet/core/blazor/tooling?view=aspnetcore-8.0&pivots=vsc) while creating a Blazor Web Application.

For example, in a Blazor Web App with the `Auto` interactive render mode, use the following commands.
For example, in a Blazor Web App with the `Server` interactive render mode, use the following commands.

{% tabs %}
{% highlight c# tabtitle="Blazor Web App" %}

dotnet new blazor -o BlazorWebApp -int Auto
dotnet new blazor -o BlazorWebApp -int Server
cd BlazorWebApp

@@ -82,7 +82,7 @@ N> For more information on creating a **Blazor Web App** with various interactiv

## Install Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor SmartComponents and Themes NuGet in the App

If you utilize `WebAssembly` or `Auto` render modes in the Blazor Web App need to be install Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor components NuGet packages within the client project.
If you utilize the `Server` render mode in the Blazor Web App, install the Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor components NuGet packages within the server project.

* Press <kbd>Ctrl</kbd>+<kbd>`</kbd> to open the integrated terminal in Visual Studio Code.
* Ensure you’re in the project root directory where your `.csproj` file is located.
@@ -110,7 +110,6 @@ N> Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor components are availa

| Interactive Render Mode | Description |
| -- | -- |
| WebAssembly or Auto | Open **~/_Imports.razor** file from the client project.|
| Server | Open the **~/_Imports.razor** file, which is located in the `Components` folder.|

Import the `Syncfusion.Blazor` and `Syncfusion.Blazor.SmartComponents` namespaces.
@@ -124,10 +123,6 @@ Import the `Syncfusion.Blazor` and `Syncfusion.Blazor.SmartComponents` namespace
{% endhighlight %}
{% endtabs %}

Now, register the Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Service in the **~/Program.cs** file of your Blazor Web App.

If the **Interactive Render Mode** is set to `WebAssembly` or `Auto`, you need to register the Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor service in both **~/Program.cs** files of your Blazor Web App.

If the **Interactive Render Mode** is set to `Server`, your project will contain a single **~/Program.cs** file. So, you should register the Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Service only in that **~/Program.cs** file.

{% tabs %}
@@ -167,7 +162,7 @@ builder.Services.AddSyncfusionBlazor();

string apiKey = "api-key";
string deploymentName = "deployment-name";
string endpoint = "end point url";
string endpoint = "end point url"; // Must be null for OpenAI

builder.Services.AddSyncfusionSmartComponents()
.ConfigureCredentials(new AIServiceCredentials(apiKey, deploymentName, endpoint))
@@ -191,9 +186,21 @@ N> From version 28.2.33, the Azure.AI.OpenAI package has been removed from the S

If you are using **OpenAI**, [create an API key](https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key) and assign it to `apiKey`, leaving the `endpoint` as `""`. The value for `deploymentName` is the [model](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models) you wish to use (e.g., `gpt-3.5-turbo`, `gpt-4`, etc.).
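
For illustration, an OpenAI-specific version of the credentials shown above might look like the following sketch. It reuses only the `AddSyncfusionSmartComponents` and `ConfigureCredentials` calls visible in the snippet above; the placeholder values are assumptions, and any additional registration from the full snippet still applies.

{% tabs %}
{% highlight c# tabtitle="~/Program.cs" %}

// Illustrative OpenAI setup (placeholder values; verify against the full snippet above).
string apiKey = "sk-your-openai-api-key"; // key created in the OpenAI dashboard
string deploymentName = "gpt-4";          // the OpenAI model to use
string endpoint = "";                     // left empty when targeting OpenAI directly

builder.Services.AddSyncfusionSmartComponents()
    .ConfigureCredentials(new AIServiceCredentials(apiKey, deploymentName, endpoint));

{% endhighlight %}
{% endtabs %}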

### Using Ollama
### Configuring Ollama for Self-Hosted AI Models

To use Ollama for running self-hosted models:

1. **Download and install Ollama**
Visit [Ollama's official website](https://ollama.com) and install the application appropriate for your operating system.

2. **Install the desired model from the Ollama library**
You can browse and install models from the [Ollama Library](https://ollama.com/library) (e.g., `llama2:13b`, `mistral:7b`, etc.).

3. **Configure your application**

If you are using [Ollama](https://ollama.com/), set SelfHosted to true and provide the Endpoint URL where the model is hosted (e.g., http://localhost:11434). The value for DeploymentName should be the specific [model](https://ollama.com/library) you wish to use (e.g., `llama2:13b`, `mistral:7b`, etc.).
- Set `SelfHosted` to `true`.
- Provide the `Endpoint` URL where the model is hosted (e.g., `http://localhost:11434`).
- Set `DeploymentName` to the specific model you installed (e.g., `llama2:13b`).
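
Put together, a minimal configuration sketch based on these steps might look like the following. The `SelfHosted`, `Endpoint`, and `DeploymentName` members are inferred from the description above rather than from an API reference, so treat this as a sketch and verify it against the full snippet below and your installed Syncfusion<sup style="font-size:70%">&reg;</sup> version.

{% tabs %}
{% highlight c# tabtitle="~/Program.cs" %}

// Illustrative self-hosted (Ollama) setup; member names inferred from the steps above.
builder.Services.AddSyncfusionSmartComponents()
    .ConfigureCredentials(new AIServiceCredentials
    {
        SelfHosted = true,                   // use a locally hosted model instead of a cloud service
        Endpoint = "http://localhost:11434", // default local Ollama endpoint
        DeploymentName = "llama2:13b"        // model installed with `ollama pull llama2:13b`
    });

{% endhighlight %}
{% endtabs %}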

Add the following settings to the **~/Program.cs** file in your Blazor Server app.

@@ -277,4 +284,4 @@ N> [View Sample in GitHub](https://github.com/syncfusion/smart-ai-samples).

## See also

* [Getting Started with Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Smart TextArea in Blazor Server App](https://blazor.syncfusion.com/documentation/)
* [Getting Started with Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Smart TextArea in Blazor Server App](https://blazor.syncfusion.com/documentation/smart-textarea/getting-started-webapp)
20 changes: 16 additions & 4 deletions blazor/smart-textarea/getting-started.md
@@ -145,7 +145,7 @@ builder.Services.AddSyncfusionBlazor();

string apiKey = "api-key";
string deploymentName = "deployment-name";
string endpoint = "end point url";
string endpoint = "end point url"; // Must be null for OpenAI

builder.Services.AddSyncfusionSmartComponents()
.ConfigureCredentials(new AIServiceCredentials(apiKey, deploymentName, endpoint))
@@ -169,9 +169,21 @@ N> From version 28.2.33, the Azure.AI.OpenAI package has been removed from the S

If you are using **OpenAI**, create an API key and assign it to `apiKey`, leaving the `endpoint` as `""`. The value for `deploymentName` is the model you wish to use (e.g., `gpt-3.5-turbo`, `gpt-4`, etc.).
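
For illustration, an OpenAI-specific version of the credentials shown above might look like the following sketch. It reuses only the `AddSyncfusionSmartComponents` and `ConfigureCredentials` calls visible in the snippet above; the placeholder values are assumptions, and any additional registration from the full snippet still applies.

{% tabs %}
{% highlight c# tabtitle="~/Program.cs" %}

// Illustrative OpenAI setup (placeholder values; verify against the full snippet above).
string apiKey = "sk-your-openai-api-key"; // key created in the OpenAI dashboard
string deploymentName = "gpt-4";          // the OpenAI model to use
string endpoint = "";                     // left empty when targeting OpenAI directly

builder.Services.AddSyncfusionSmartComponents()
    .ConfigureCredentials(new AIServiceCredentials(apiKey, deploymentName, endpoint));

{% endhighlight %}
{% endtabs %}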

### Using Ollama
### Configuring Ollama for Self-Hosted AI Models

If you are using [Ollama](https://ollama.com/), set SelfHosted to true and provide the Endpoint URL where the model is hosted (e.g., http://localhost:11434). The value for DeploymentName should be the specific [model](https://ollama.com/library) you wish to use (e.g., `llama2:13b`, `mistral:7b`, etc.).
To use Ollama for running self-hosted models:

1. **Download and install Ollama**
Visit [Ollama's official website](https://ollama.com) and install the application appropriate for your operating system.

2. **Install the desired model from the Ollama library**
You can browse and install models from the [Ollama Library](https://ollama.com/library) (e.g., `llama2:13b`, `mistral:7b`, etc.).

3. **Configure your application**

- Set `SelfHosted` to `true`.
- Provide the `Endpoint` URL where the model is hosted (e.g., `http://localhost:11434`).
- Set `DeploymentName` to the specific model you installed (e.g., `llama2:13b`).
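
Put together, a minimal configuration sketch based on these steps might look like the following. The `SelfHosted`, `Endpoint`, and `DeploymentName` members are inferred from the description above rather than from an API reference, so treat this as a sketch and verify it against the full snippet below and your installed Syncfusion<sup style="font-size:70%">&reg;</sup> version.

{% tabs %}
{% highlight c# tabtitle="~/Program.cs" %}

// Illustrative self-hosted (Ollama) setup; member names inferred from the steps above.
builder.Services.AddSyncfusionSmartComponents()
    .ConfigureCredentials(new AIServiceCredentials
    {
        SelfHosted = true,                   // use a locally hosted model instead of a cloud service
        Endpoint = "http://localhost:11434", // default local Ollama endpoint
        DeploymentName = "llama2:13b"        // model installed with `ollama pull llama2:13b`
    });

{% endhighlight %}
{% endtabs %}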

Add the following settings to the **~/Program.cs** file in your Blazor Server app.

@@ -256,4 +268,4 @@ N> [View Sample in GitHub](https://github.com/syncfusion/smart-ai-samples).

## See also

* [Getting Started with Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Smart TextArea in Blazor Web App](https://blazor.syncfusion.com/documentation/)
* [Getting Started with Syncfusion<sup style="font-size:70%">&reg;</sup> Blazor Smart TextArea in Blazor Web App](https://blazor.syncfusion.com/documentation/smart-textarea/getting-started)