You are given a reference to a node in a connected undirected graph. Return a deep copy (clone) of the graph. Each node in the graph contains a value (int) and a list (List[Node]) of its neighbors.
class Node {
public:
    int val;
    vector<Node*> neighbors;
    Node() {
        val = 0;
        neighbors = vector<Node*>();
    }
    Node(int _val) {
        val = _val;
        neighbors = vector<Node*>();
    }
    Node(int _val, vector<Node*> _neighbors) {
        val = _val;
        neighbors = _neighbors;
    }
};
Constraints: the given node may be nullptr (an empty graph), in which case the clone should also be nullptr.
The problem requires creating a deep copy of the graph. Each node is reachable only through the pointers in its neighbors vector, and the graph may contain cycles, so a naive recursive copy would recurse forever. We therefore use depth-first search (DFS) together with a hash map from original nodes to their clones, which breaks cycles and ensures each node is cloned exactly once.
Here’s a concise C++ solution using DFS:
#include <unordered_map>
#include <vector>
using namespace std;

// Definition for a Node (as given in the problem).
class Node {
public:
    int val;
    vector<Node*> neighbors;
    Node() {
        val = 0;
        neighbors = vector<Node*>();
    }
    Node(int _val) {
        val = _val;
        neighbors = vector<Node*>();
    }
    Node(int _val, vector<Node*> _neighbors) {
        val = _val;
        neighbors = _neighbors;
    }
};

class Solution {
private:
    // Maps each original node to its clone so every node is copied once.
    unordered_map<Node*, Node*> clonedNodes;

    Node* cloneGraphHelper(Node* node) {
        if (!node) return nullptr;
        // If this node was already cloned, reuse the existing clone.
        auto it = clonedNodes.find(node);
        if (it != clonedNodes.end()) {
            return it->second;
        }
        // Clone the current node and record it *before* recursing,
        // so that cycles leading back to this node terminate.
        Node* clonedNode = new Node(node->val);
        clonedNodes[node] = clonedNode;
        // Recursively clone the neighbors.
        for (Node* neighbor : node->neighbors) {
            clonedNode->neighbors.push_back(cloneGraphHelper(neighbor));
        }
        return clonedNode;
    }

public:
    Node* cloneGraph(Node* node) {
        return cloneGraphHelper(node);
    }
};
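To sanity-check the result, here is a small, hypothetical test harness (not part of the problem; the two-node cycle and the assertions are ours) that builds a graph, clones it, and verifies the copy uses distinct allocations while preserving structure:

#include <cassert>

int main() {
    // Build a two-node cycle: 1 <-> 2.
    Node* a = new Node(1);
    Node* b = new Node(2);
    a->neighbors.push_back(b);
    b->neighbors.push_back(a);

    Solution sol;
    Node* clonedA = sol.cloneGraph(a);

    // The clone has equal values but different addresses.
    assert(clonedA != a && clonedA->val == 1);
    assert(clonedA->neighbors[0] != b && clonedA->neighbors[0]->val == 2);
    // The cycle is preserved: the clone of b points back to the clone of a.
    assert(clonedA->neighbors[0]->neighbors[0] == clonedA);
    return 0;
}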
The time complexity of the DFS solution is O(N + E), where N is the number of nodes and E is the number of edges, because each node and each edge is visited exactly once during the traversal. The space complexity is O(N) for the hash map and the recursion stack.
This approach ensures a deep copy of the graph while maintaining efficiency.
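For very deep graphs where recursion depth could become a concern, the same hash-map idea works iteratively. A minimal BFS sketch, assuming the Node definition above (the class name SolutionBFS is ours):

#include <queue>
#include <unordered_map>
using namespace std;

class SolutionBFS {
public:
    Node* cloneGraph(Node* node) {
        if (!node) return nullptr;
        unordered_map<Node*, Node*> cloned;
        cloned[node] = new Node(node->val);
        queue<Node*> q;
        q.push(node);
        while (!q.empty()) {
            Node* cur = q.front();
            q.pop();
            for (Node* neighbor : cur->neighbors) {
                // First time we see a neighbor: clone it and enqueue it.
                if (cloned.find(neighbor) == cloned.end()) {
                    cloned[neighbor] = new Node(neighbor->val);
                    q.push(neighbor);
                }
                // Wire the cloned edge.
                cloned[cur]->neighbors.push_back(cloned[neighbor]);
            }
        }
        return cloned[node];
    }
};

This variant trades the recursion stack for an explicit queue; the complexity remains O(N + E).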